Apr 21 10:42:53 user nova-compute[70954]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 21 10:42:56 user nova-compute[70954]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=70954) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 21 10:42:56 user nova-compute[70954]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=70954) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 21 10:42:56 user nova-compute[70954]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=70954) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 21 10:42:56 user nova-compute[70954]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 21 10:42:56 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 21 10:42:56 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.018s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 21 10:42:56 user nova-compute[70954]: INFO nova.virt.driver [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 21 10:42:57 user nova-compute[70954]: INFO nova.compute.provider_config [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
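[Editor's note] The two oslo_concurrency.processutils entries above record a capability probe: nova checks whether the installed iscsiadm binary contains the string "node.session.scan". A minimal sketch of the same probe using oslo_concurrency directly is shown below; the helper name and the exit-code handling are illustrative assumptions, only the command itself is taken from the log.

    # Sketch: rerun the capability probe logged above via oslo_concurrency.
    # Only the command "grep -F node.session.scan /sbin/iscsiadm" comes from
    # the log; the helper name and exit-code policy are assumptions.
    from oslo_concurrency import processutils

    def iscsiadm_supports_manual_scan():
        # grep exits 0 when the pattern is found and 1 when it is not;
        # accept both so a missing pattern is a result, not an error.
        out, _err = processutils.execute(
            'grep', '-F', 'node.session.scan', '/sbin/iscsiadm',
            check_exit_code=[0, 1])
        return bool(out)

The "returned: 0 in 0.018s" entry above corresponds to grep finding the pattern, i.e. the probe succeeding.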
Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Acquiring lock "singleton_lock" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Acquired lock "singleton_lock" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Releasing lock "singleton_lock" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Full set of CONF: {{(pid=70954) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ******************************************************************************** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Configuration options gathered from: {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ================================================================================ {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] allow_resize_to_same_host = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] arq_binding_timeout = 300 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] backdoor_port = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] backdoor_socket = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c 
None None] block_device_allocate_retries = 300 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] block_device_allocate_retries_interval = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cert = self.pem {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute_driver = libvirt.LibvirtDriver {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute_monitors = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] config_dir = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] config_drive_format = iso9660 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] config_source = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] console_host = user {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] control_exchange = nova {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cpu_allocation_ratio = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] daemon = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] debug = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] default_access_ip_network_name = None {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] default_availability_zone = nova {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] default_ephemeral_format = ext4 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] default_schedule_zone = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] disk_allocation_ratio = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] enable_new_services = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] enabled_apis = ['osapi_compute'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] enabled_ssl_apis = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] flat_injected = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] force_config_drive = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] force_raw_images = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] graceful_shutdown_timeout = 5 {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] heal_instance_info_cache_interval = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] host = user {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] initial_cpu_allocation_ratio = 4.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] initial_disk_allocation_ratio = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] initial_ram_allocation_ratio = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_build_timeout = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_delete_interval = 300 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_format = [instance: %(uuid)s] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_name_template = instance-%08x {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_usage_audit = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_usage_audit_period = month {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] internal_service_availability_zone = internal {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] key = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] live_migration_retry_count = 30 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_config_append = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_dir = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_options = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_rotate_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_rotate_interval_type = days {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] log_rotation_type = none {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] long_rpc_timeout = 1800 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] max_concurrent_builds = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] max_concurrent_live_migrations = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] max_concurrent_snapshots = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] max_local_block_devices = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] max_logfile_count = 30 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] max_logfile_size_mb = 200 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] maximum_instance_delete_attempts = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metadata_listen = 0.0.0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metadata_listen_port = 8775 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user 
nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metadata_workers = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] migrate_max_retries = -1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] mkisofs_cmd = genisoimage {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] my_block_storage_ip = 10.0.0.210 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] my_ip = 10.0.0.210 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] network_allocate_retries = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] osapi_compute_listen = 0.0.0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] osapi_compute_listen_port = 8774 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] osapi_compute_unique_server_name_scope = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] osapi_compute_workers = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] password_length = 12 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] periodic_enable = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] periodic_fuzzy_delay = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: 
DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] pointer_model = ps2mouse {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] preallocate_images = none {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] publish_errors = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] pybasedir = /opt/stack/nova {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ram_allocation_ratio = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rate_limit_burst = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rate_limit_except_level = CRITICAL {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rate_limit_interval = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reboot_timeout = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reclaim_instance_interval = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] record = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reimage_timeout_per_gb = 20 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] report_interval = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rescue_timeout = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reserved_host_cpus = 
0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reserved_host_disk_mb = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reserved_host_memory_mb = 512 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] reserved_huge_pages = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] resize_confirm_window = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] resize_fs_using_block_device = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] resume_guests_state_on_host_boot = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rpc_response_timeout = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] run_external_periodic_tasks = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] running_deleted_instance_action = reap {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] running_deleted_instance_poll_interval = 1800 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] running_deleted_instance_timeout = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler_instance_sync_interval = 120 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None 
None] service_down_time = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] servicegroup_driver = db {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] shelved_offload_time = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] shelved_poll_interval = 3600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] shutdown_timeout = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] source_is_ipv6 = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ssl_only = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] state_path = /opt/stack/data/nova {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] sync_power_state_interval = 600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] sync_power_state_pool_size = 1000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] syslog_log_facility = LOG_USER {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] tempdir = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] timeout_nbd = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] transport_url = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] update_resources_interval = 0 {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_cow_images = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_eventlog = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_journal = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_json = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_rootwrap_daemon = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_stderr = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] use_syslog = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vcpu_pin_set = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plugging_is_fatal = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plugging_timeout = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] virt_mkfs = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] volume_usage_poll_interval = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] watch_log_file = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] web = /usr/share/spice-html5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG 
oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_concurrency.disable_process_locking = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_metrics.metrics_process_name = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.auth_strategy = keystone {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.compute_link_prefix = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.dhcp_domain = novalocal {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.enable_instance_password = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.glance_link_prefix = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service 
[None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.instance_list_cells_batch_strategy = distributed {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.instance_list_per_project_cells = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.list_records_by_skipping_down_cells = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.local_metadata_per_cell = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.max_limit = 1000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.metadata_cache_expiration = 15 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.neutron_default_tenant_id = default {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.use_forwarded_for = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.use_neutron_default_nets = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_dynamic_failure_fatal = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_dynamic_ssl_certfile = {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_dynamic_targets = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_jsonfile_path = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api.vendordata_providers = ['StaticJSON'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.backend = dogpile.cache.memcached {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.backend_argument = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.config_prefix = cache.oslo {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.dead_timeout = 60.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.debug_cache_backend = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.enable_retry_client = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.enable_socket_keepalive = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.enabled = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.expiration_time = 600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.hashclient_retry_attempts = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.hashclient_retry_delay = 1.0 
{{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_dead_retry = 300 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_password = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_pool_maxsize = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_pool_unused_timeout = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_sasl_enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_servers = ['localhost:11211'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_socket_timeout = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.memcache_username = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.proxies = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.retry_attempts = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.retry_delay = 0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
cache.socket_keepalive_count = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.socket_keepalive_idle = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.socket_keepalive_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.tls_allowed_ciphers = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.tls_cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.tls_certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.tls_enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cache.tls_keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.auth_type = password {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.catalog_info = volumev3::publicURL {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.cross_az_attach = True 
{{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.debug = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.endpoint_template = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.http_retries = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.os_region_name = RegionOne {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cinder.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.cpu_dedicated_set = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.cpu_shared_set = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.image_type_exclude_list = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.live_migration_wait_for_vif_plug = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.max_concurrent_disk_ops = 
0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.max_disk_devices_to_attach = -1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.resource_provider_association_refresh = 300 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.shutdown_retry_interval = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] conductor.workers = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] console.allowed_origins = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] console.ssl_ciphers = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] console.ssl_minimum_version = default {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] consoleauth.token_ttl = 600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG 
oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.connect_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.region_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.service_type = accelerator {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None 
req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] cyborg.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.backend = sqlalchemy {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.connection = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.connection_debug = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.connection_parameters = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.connection_recycle_time = 3600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.connection_trace = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.db_inc_retry_interval = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.db_max_retries = 20 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.db_max_retry_interval = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.db_retry_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.max_overflow = 50 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service 
[None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.max_pool_size = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.max_retries = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.mysql_enable_ndb = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.mysql_sql_mode = TRADITIONAL {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.mysql_wsrep_sync_wait = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.pool_timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.retry_interval = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.slave_connection = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] database.sqlite_synchronous = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.backend = sqlalchemy {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.connection = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.connection_debug = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.connection_parameters = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.connection_recycle_time = 3600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: 
DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.connection_trace = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.db_inc_retry_interval = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.db_max_retries = 20 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.db_max_retry_interval = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.db_retry_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.max_overflow = 50 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.max_pool_size = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.max_retries = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.mysql_enable_ndb = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.mysql_wsrep_sync_wait = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.pool_timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.retry_interval = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.slave_connection = **** {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] api_database.sqlite_synchronous = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] devices.enabled_mdev_types = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ephemeral_storage_encryption.enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ephemeral_storage_encryption.key_size = 512 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.api_servers = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.connect_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.debug = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.default_trusted_certificate_ids = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
glance.enable_certificate_validation = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.enable_rbd_download = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.num_retries = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.rbd_ceph_conf = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.rbd_connect_timeout = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.rbd_pool = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.rbd_user = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.region_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.service_type = image {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.verify_glance_signatures = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] glance.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] guestfs.debug = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.config_drive_cdrom = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.config_drive_inject_password = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.enable_instance_metrics_collection = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.enable_remotefx = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
hyperv.instances_path_share = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.iscsi_initiator_list = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.limit_cpu_features = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.power_state_check_timeframe = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.power_state_event_polling_interval = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.use_multipath_io = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.volume_attach_retry_count = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.volume_attach_retry_interval = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.vswitch_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] mks.enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service 
[None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] image_cache.manager_interval = 2400 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] image_cache.precache_concurrency = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] image_cache.remove_unused_base_images = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] image_cache.subdirectory_name = _base {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.api_max_retries = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.api_retry_interval = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.auth_type = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.connect_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.partition_key = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.peer_list = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.region_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.serial_console_state_timeout = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.service_type = baremetal {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG 
oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ironic.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] key_manager.fixed_key = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.barbican_api_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.barbican_endpoint = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.barbican_endpoint_type = public {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.barbican_region_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.certfile = None {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.number_of_retries = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.retry_delay = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.send_service_user_token = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.verify_ssl = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican.verify_ssl_path = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.auth_type = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.certfile = None {{(pid=70954) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] barbican_service_user.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.approle_role_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.approle_secret_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.kv_mountpoint = secret {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.kv_version = 2 {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.namespace = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.root_token_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.ssl_ca_crt_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.use_ssl = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.connect_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.insecure = False {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.region_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.service_type = identity {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] keystone.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.connection_uri = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_mode = custom {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_model_extra_flags = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: WARNING oslo_config.cfg [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_models = ['Nehalem'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_power_governor_high = performance {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_power_governor_low = powersave {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_power_management = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.device_detach_attempts = 8 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.device_detach_timeout = 20 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.disk_cachemodes = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.disk_prefix = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.enabled_perf_events = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.file_backed_memory = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.gid_maps = [] 
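
The warning above notes that libvirt.cpu_model is deprecated in favour of libvirt.cpu_models, and the effective values logged are cpu_mode = custom with cpu_models = ['Nehalem']. A hedged sketch of the non-deprecated spelling of those CPU settings as oslo.config INI follows; whether these keys were explicitly set in /etc/nova/nova-cpu.conf rather than defaulted is an assumption, the values themselves are the ones from the dump.

    # Hedged sketch: the non-deprecated form of the CPU settings logged above,
    # written out as an oslo.config INI fragment (sample file name is made up).
    import configparser

    conf = configparser.ConfigParser()
    conf['libvirt'] = {
        'cpu_mode': 'custom',
        'cpu_models': 'Nehalem',   # replaces the deprecated "cpu_model" option
    }
    with open('libvirt-cpu.sample.conf', 'w') as fh:
        conf.write(fh)
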
{{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.hw_disk_discard = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.hw_machine_type = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_rbd_ceph_conf = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_rbd_glance_store_name = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_rbd_pool = rbd {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_type = default {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.images_volume_group = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.inject_key = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.inject_partition = -2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.inject_password = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.iscsi_iface = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
libvirt.iser_use_multipath = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_bandwidth = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_completion_timeout = 800 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_downtime = 500 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_downtime_delay = 75 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_downtime_steps = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_inbound_addr = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_permit_auto_converge = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_permit_post_copy = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_scheme = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_timeout_action = abort {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_tunnelled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: WARNING oslo_config.cfg [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 21 10:42:57 user nova-compute[70954]: live_migration_uri is deprecated for removal in favor of two other options that Apr 21 10:42:57 user nova-compute[70954]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 21 10:42:57 user nova-compute[70954]: and 
``live_migration_inbound_addr`` respectively. Apr 21 10:42:57 user nova-compute[70954]: ). Its value may be silently ignored in the future. Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.live_migration_with_native_tls = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.max_queues = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.mem_stats_period_seconds = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.nfs_mount_options = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.num_aoe_discover_tries = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.num_iser_scan_tries = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.num_memory_encrypted_guests = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.num_nvme_discover_tries = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.num_pcie_ports = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.num_volume_scan_tries = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.pmem_namespaces = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user 
nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.quobyte_client_cfg = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rbd_connect_timeout = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rbd_secret_uuid = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rbd_user = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.realtime_scheduler_priority = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.remote_filesystem_transport = ssh {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rescue_image_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rescue_kernel_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rescue_ramdisk_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rng_dev_path = /dev/urandom {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.rx_queue_size = None {{(pid=70954) log_opt_values 
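
The multi-line warning a few entries above says live_migration_uri is deprecated in favour of live_migration_scheme and live_migration_inbound_addr. A hedged sketch of that replacement pair follows: the ssh scheme matches the qemu+ssh scheme in the logged URI, the address is a placeholder, and the stack@ user embedded in the logged URI is not reproduced by these two options, which may be why the deprecated option is still in use here.

    # Hedged sketch of the replacement suggested by the deprecation warning:
    # transport scheme and target address are expressed as two options instead
    # of one URI. The address below is a placeholder, not taken from this log.
    import configparser

    conf = configparser.ConfigParser()
    conf['libvirt'] = {
        'live_migration_scheme': 'ssh',                          # qemu+ssh://<host>/system
        'live_migration_inbound_addr': 'compute1.example.net',   # placeholder
    }
    print({name: dict(conf[name]) for name in conf.sections()})
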
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.smbfs_mount_options = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.snapshot_compression = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.snapshot_image_format = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.sparse_logical_volumes = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.swtpm_enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.swtpm_group = tss {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.swtpm_user = tss {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.sysinfo_serial = unique {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.tx_queue_size = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.uid_maps = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.use_virtio_for_bridges = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
libvirt.virt_type = kvm {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.volume_clear = zero {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.volume_clear_size = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.volume_use_multipath = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_cache_path = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_mount_group = qemu {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_mount_opts = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.vzstorage_mount_user = stack {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.auth_type = password {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user 
nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.connect_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.default_floating_pool = public {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.extension_sync_interval = 600 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.http_retries = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.metadata_proxy_shared_secret = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: 
DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.ovs_bridge = br-int {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.physnets = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.region_name = RegionOne {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.service_metadata_proxy = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.service_type = network {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] neutron.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] notifications.bdms_in_notifications = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] notifications.default_level = INFO {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user 
nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] notifications.notification_format = unversioned {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] notifications.notify_on_state_change = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] pci.alias = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] pci.device_spec = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] pci.report_in_placement = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.auth_type = password {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.auth_url = http://10.0.0.210/identity {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.connect_retry_delay = None {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.default_domain_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.default_domain_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.domain_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.domain_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.password = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.project_domain_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.project_domain_name = Default {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.project_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.project_name = service {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.region_name = RegionOne {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.service_type = placement {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.system_scope = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.trust_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.user_domain_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.user_domain_name = Default {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.user_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.username = placement {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.valid_interfaces = ['internal', 'public'] 
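
Taken together, the placement.* values logged above describe the credentials nova-compute uses to reach the Placement API. A hedged sketch of the corresponding [placement] section as oslo.config INI follows; which keys were explicitly set in /etc/nova/nova-cpu.conf (rather than defaulted) is an assumption, and the real password is masked as **** in the log, so a placeholder is used.

    # Hedged sketch: the [placement] section implied by the logged values,
    # written back out as oslo.config INI (sample file name is made up).
    import configparser

    conf = configparser.ConfigParser()
    conf['placement'] = {
        'auth_type': 'password',
        'auth_url': 'http://10.0.0.210/identity',
        'username': 'placement',
        'password': 'REDACTED',             # logged as ****
        'project_name': 'service',
        'project_domain_name': 'Default',
        'user_domain_name': 'Default',
        'region_name': 'RegionOne',
        'valid_interfaces': 'internal,public',
    }
    with open('placement.sample.conf', 'w') as fh:
        conf.write(fh)
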
{{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] placement.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.cores = 20 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.count_usage_from_placement = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.injected_file_content_bytes = 10240 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.injected_file_path_length = 255 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.injected_files = 5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.instances = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.key_pairs = 100 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.metadata_items = 128 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.ram = 51200 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.recheck_quota = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.server_group_members = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] quota.server_groups = 10 {{(pid=70954) log_opt_values 
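
A small worked check of the per-project quota values logged above: with quota.instances = 10, quota.cores = 20 and quota.ram = 51200 (MiB), filling the instance quota with identical servers leaves at most 2 vCPUs and 5120 MiB per instance.

    # Arithmetic check of the logged quota values (no Nova API calls involved).
    instances, cores, ram_mib = 10, 20, 51200
    per_instance_vcpus = cores // instances      # 2
    per_instance_ram_mib = ram_mib // instances  # 5120
    print(per_instance_vcpus, per_instance_ram_mib)
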
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rdp.enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.image_metadata_prefilter = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.max_attempts = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.max_placement_results = 1000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.query_placement_for_availability_zone = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.query_placement_for_image_type_support = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] scheduler.workers = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 
10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.host_subset_size = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.image_properties_default_architecture = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.isolated_hosts = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: 
DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.isolated_images = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.max_instances_per_host = 50 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.pci_in_placement = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.track_instance_changes = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metrics.required = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service 
[None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metrics.weight_multiplier = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metrics.weight_of_unavailable = -10000.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] metrics.weight_setting = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] serial_console.enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] serial_console.port_range = 10000:20000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] serial_console.serialproxy_port = 6083 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.auth_type = password {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 
10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.send_service_user_token = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] service_user.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.agent_enabled = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.html5proxy_host = 0.0.0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.html5proxy_port = 6082 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.image_compression = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.jpeg_compression = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.playback_compression = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.server_listen = 127.0.0.1 {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.streaming_mode = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] spice.zlib_compression = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] upgrade_levels.baseapi = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] upgrade_levels.cert = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] upgrade_levels.compute = auto {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] upgrade_levels.conductor = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] upgrade_levels.scheduler = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.auth_type = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
vendordata_dynamic_auth.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vendordata_dynamic_auth.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.api_retry_count = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.ca_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.cache_prefix = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.cluster_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.connection_pool_size = 10 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.console_delay_seconds = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.datastore_regex = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.host_ip = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.host_password = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.host_port = 443 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
vmware.host_username = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.integration_bridge = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.maximum_objects = 100 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.pbm_default_policy = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.pbm_enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.pbm_wsdl_location = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.serial_port_proxy_uri = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.serial_port_service_uri = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.task_poll_interval = 0.5 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.use_linked_clone = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.vnc_keymap = en-us {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vmware.vnc_port = 5900 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
vmware.vnc_port_total = 10000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.auth_schemes = ['none'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.enabled = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.novncproxy_port = 6080 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.server_listen = 0.0.0.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.vencrypt_ca_certs = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.vencrypt_client_cert = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vnc.vencrypt_client_key = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.disable_fallback_pcpu_query = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.disable_group_policy_check_upcall = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG 
oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.disable_rootwrap = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.enable_numa_live_migration = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.handle_virt_lifecycle_events = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.libvirt_disable_apic = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.never_download_image_if_on_rbd = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None 
req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.client_socket_timeout = 900 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.default_pool_size = 1000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.keep_alive = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.max_header_line = 16384 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.secure_proxy_ssl_header = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.ssl_ca_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.ssl_cert_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.ssl_key_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.tcp_keepidle = 600 {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] zvm.ca_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] zvm.cloud_connector_url = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] zvm.reachable_timeout = 300 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.enforce_new_defaults = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.enforce_scope = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.policy_default_rule = default {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.policy_file = policy.yaml {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.connection_string = messaging:// {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.enabled = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.es_doc_type = notification {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.es_scroll_size = 10000 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.es_scroll_time = 2m {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.filter_error_trace = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.hmac_keys = SECRET_KEY {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.sentinel_service_name = mymaster {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] profiler.socket_timeout = 0.1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
profiler.trace_sqlalchemy = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] remote_debug.host = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] remote_debug.port = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=70954) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.ssl = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_rabbit.ssl_version = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_notifications.retry = -1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_messaging_notifications.transport_url = **** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.auth_section = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.auth_type = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user 
nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.cafile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.certfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.collect_timing = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.connect_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.connect_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.endpoint_id = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.endpoint_override = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.insecure = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.keyfile = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.max_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.min_version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.region_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.service_name = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.service_type = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user 
nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.split_loggers = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.status_code_retries = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.status_code_retry_delay = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.timeout = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.valid_interfaces = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_limit.version = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_reports.file_event_handler = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_reports.file_event_handler_interval = 1 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] oslo_reports.log_dir = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_linux_bridge_privileged.group = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_linux_bridge_privileged.user = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_ovs_privileged.group = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_ovs_privileged.helper_command = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] vif_plug_ovs_privileged.user = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.flat_interface = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=70954) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_linux_bridge.vlan_interface = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_ovs.isolate_vif = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_ovs.ovsdb_interface = native {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_vif_ovs.per_port_bridge = False {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] os_brick.lock_path = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] privsep_osbrick.capabilities = [21] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] privsep_osbrick.group = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] privsep_osbrick.helper_command = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None 
req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] privsep_osbrick.thread_pool_size = 12 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] privsep_osbrick.user = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] nova_sys_admin.group = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] nova_sys_admin.helper_command = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] nova_sys_admin.thread_pool_size = 12 {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] nova_sys_admin.user = None {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG oslo_service.service [None req-0dae9362-3cfc-44c1-bba6-216d2ec3778c None None] ******************************************************************************** {{(pid=70954) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 21 10:42:57 user nova-compute[70954]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Starting native event thread {{(pid=70954) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Starting green dispatch thread {{(pid=70954) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Starting connection event dispatch thread {{(pid=70954) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Connecting to libvirt: qemu:///system {{(pid=70954) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Registering for lifecycle events {{(pid=70954) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Registering for connection events: {{(pid=70954) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Apr 21 10:42:57 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Connection event '1' reason 'None' Apr 21 10:42:57 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 21 10:42:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.volume.mount [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Initialising _HostMountState generation 0 {{(pid=70954) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Apr 21 10:43:04 user nova-compute[70954]: INFO nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host capabilities
[The libvirt host capabilities XML document follows in the original log; its markup was stripped in this capture and only the text values remain. Recoverable fields: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; arch x86_64; CPU model IvyBridge-IBRS, vendor Intel; migration transports tcp and rdma; memory/page counts 8189224 / 2047306 / 0 and 8218764 / 2054691 / 0; security models apparmor (doi 0) and dac (doi 0, baselabel +64055:+108); hvm guest support via /usr/bin/qemu-system-{alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa, xtensaeb}, each with its list of supported machine types (virt-*, pc-i440fx-*, pc-q35-*, pseries-*, s390-ccw-virtio-*, and board-specific types).]
Apr 21 10:43:04 user
nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG 
nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'q35', 'ubuntu', 'pc'} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: Apr 21 10:43:04 user nova-compute[70954]: [domain capabilities XML elided: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-q35-jammy, arch i686, OVMF/AAVMF loader paths, loader types rom/pflash, host CPU model IvyBridge-IBRS (Intel), custom CPU models qemu64 through 486, memory backing file/anonymous/memfd, disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata, video virtio, graphics sdl/vnc/spice/egl-headless, hostdev subsystem usb/pci/scsi, rng backends random/egd/builtin, filesystem path/handle/virtiofs, tpm tpm-tis/tpm-crb with passthrough/emulator backends] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: Apr 21 10:43:04 user nova-compute[70954]: [domain capabilities XML elided: same as for machine_type=ubuntu-q35 but with canonical machine pc-q35-6.2] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: Apr 21 10:43:04 user nova-compute[70954]: [domain capabilities XML elided: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-i440fx-jammy, arch i686, same firmware, CPU and device capabilities as above plus the ide disk bus] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Apr 21 10:43:04 user nova-compute[70954]: [domain capabilities XML elided: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-i440fx-6.2, arch i686, same firmware, CPU and device capabilities as above plus the ide disk bus] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 21 10:43:04 user
nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from
libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', 'powernv', None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving 
domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported 
by '/usr/bin/qemu-system-riscv64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by 
'/usr/bin/qemu-system-sparc64' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'q35', 'ubuntu', 'pc'} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: [domainCapabilities XML elided; key values: path /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-jammy, arch x86_64, efi firmware /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd, /usr/share/OVMF/OVMF_CODE_4M.fd (rom/pflash), host-model CPU IvyBridge-IBRS (Intel), custom CPU models qemu64 through 486, plus the memory backing, disk, graphics, hostdev, rng, filesystem and TPM feature enumerations] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML elided; key values: path /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-6.2, arch x86_64, efi firmware /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd, /usr/share/OVMF/OVMF_CODE_4M.fd (rom/pflash), host-model CPU IvyBridge-IBRS (Intel), custom CPU models qemu64 through 486, plus the device and feature enumerations] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 10:43:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [domainCapabilities XML elided; key values: path /usr/bin/qemu-system-x86_64, domain kvm, machine pc-i440fx-jammy, arch x86_64, efi firmware /usr/share/OVMF/OVMF_CODE_4M.fd (rom/pflash), host-model CPU IvyBridge-IBRS (Intel), custom CPU models qemu64 through 486, plus the device and feature enumerations] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML elided; key values: path /usr/bin/qemu-system-x86_64, domain kvm, machine pc-i440fx-6.2, arch x86_64, efi firmware /usr/share/OVMF/OVMF_CODE_4M.fd (rom/pflash), host-model CPU IvyBridge-IBRS (Intel), custom CPU models qemu64 through 486, plus the device and feature enumerations] {{(pid=70954) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=70954) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=70954) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Checking secure boot support for host arch (x86_64) {{(pid=70954) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 21 10:43:05 user nova-compute[70954]: INFO nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Secure Boot support detected
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] cpu compare xml: [cpu XML elided; model Nehalem] {{(pid=70954) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 21 10:43:05 user nova-compute[70954]: INFO nova.virt.node [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Generated node identity f5a93adf-7a38-4ac6-ba5b-d6a75e692e97
Apr 21 10:43:05 user nova-compute[70954]: INFO nova.virt.node [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Wrote node identity f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 to /opt/stack/data/nova/compute_id
Apr 21 10:43:05 user nova-compute[70954]: WARNING nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Compute nodes ['f5a93adf-7a38-4ac6-ba5b-d6a75e692e97'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
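Note: the per-arch probing logged above maps onto libvirt's getDomainCapabilities call. A minimal sketch of the same probe using the libvirt Python bindings; the emulator paths and machine types are taken from the log, while the loop itself is illustrative and not Nova's actual implementation:

```python
# Illustrative probe of libvirt domain capabilities, mirroring the
# "Getting domain capabilities for <arch> via machine types" entries above.
# Assumes libvirt-python is installed and qemu:///system is reachable.
import libvirt

PROBES = [
    ("/usr/bin/qemu-system-ppc64", "ppc64", "pseries"),
    ("/usr/bin/qemu-system-s390x", "s390x", "s390-ccw-virtio"),
    ("/usr/bin/qemu-system-x86_64", "x86_64", "q35"),
]

conn = libvirt.open("qemu:///system")
try:
    for emulator, arch, machine in PROBES:
        try:
            xml = conn.getDomainCapabilities(emulator, arch, machine, "kvm")
            print(f"{arch}/{machine}: got {len(xml)} bytes of capabilities XML")
        except libvirt.libvirtError as exc:
            # Foreign arches fail with error code 8 (invalid argument),
            # i.e. "KVM is not supported by '<emulator>' on this host".
            print(f"{arch}/{machine}: error {exc.get_error_code()}: {exc}")
finally:
    conn.close()
```

On the x86_64 host in this log, only the x86_64 probes return a capabilities document; every foreign-arch probe raises the error code 8 seen above.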
Apr 21 10:43:05 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 21 10:43:05 user nova-compute[70954]: WARNING nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 21 10:43:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:43:05 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:43:05 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
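Note: the Acquiring/Acquired/released lines for the "compute_resources" lock come from oslo.concurrency. A rough sketch of the decorator pattern that produces them; the class and method below are placeholders, not the actual ResourceTracker code:

```python
# Rough sketch of the oslo.concurrency locking pattern behind the
# 'Lock "compute_resources" acquired ... waited 0.000s / held 0.000s'
# entries above.
from oslo_concurrency import lockutils

COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

class ResourceTrackerSketch:
    @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
    def clean_compute_node_cache(self, compute_nodes_in_db):
        # Runs only while the process-local "compute_resources" lock is
        # held; lockutils logs the acquire, wait and held timings.
        pass

ResourceTrackerSketch().clean_compute_node_cache([])
```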
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Hypervisor/Node resource view: name=user free_ram=10815MB free_disk=27.052963256835938GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:43:05 user nova-compute[70954]: WARNING nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] No compute node record for user:f5a93adf-7a38-4ac6-ba5b-d6a75e692e97: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 could not be found. Apr 21 10:43:05 user nova-compute[70954]: INFO nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Compute node record created for user:user with uuid: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:43:05 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [req-6ce6a9ab-acbf-492c-ba7d-ebac77673158] Created resource provider record via placement API for resource provider with UUID f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 and name user. 
Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=70954) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 21 10:43:05 user nova-compute[70954]: INFO nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] kernel doesn't support AMD SEV Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Libvirt baseline CPU: arch x86_64, model Nehalem, vendor Intel {{(pid=70954) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Updated inventory for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Updating resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 generation from 0 to 1 during operation: update_inventory {{(pid=70954) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 21 10:43:05 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}}
{{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 10:43:06 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Updating resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 generation from 1 to 2 during operation: update_traits {{(pid=70954) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 21 10:43:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:43:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:43:06 user nova-compute[70954]: DEBUG nova.service [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Creating RPC server for service compute {{(pid=70954) start /opt/stack/nova/nova/service.py:182}} Apr 21 10:43:06 user nova-compute[70954]: DEBUG nova.service [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Join ServiceGroup membership for this service compute {{(pid=70954) start /opt/stack/nova/nova/service.py:199}} Apr 21 10:43:06 user nova-compute[70954]: DEBUG nova.servicegroup.drivers.db [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=70954) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. 
{{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_power_states {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:43:58 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:43:58 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=10168MB free_disk=26.966777801513672GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:43:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:44:57 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:44:57 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:44:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=10176MB free_disk=27.01404571533203GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:44:57 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:44:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:56 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:45:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:45:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:45:56 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:45:57 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:45:57 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:45:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=10153MB free_disk=26.790088653564453GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:45:57 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:45:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:45:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:57 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:57 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. 
{{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:46:58 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:46:59 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:46:59 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
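Aside on the lock entries above: the "Acquiring lock / acquired / released" DEBUG lines around the resource tracker come from the oslo_concurrency lock decorator, whose wrapper logs the wait and hold times seen in the log. A minimal sketch of that pattern follows, assuming the standard lockutils API; the function name is illustrative, not actual nova code.

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_resources():
    # The body runs only while the in-process 'compute_resources' lock is
    # held; the decorator's wrapper emits the "Acquiring lock ... by ...",
    # "acquired ... :: waited" and "released ... :: held" DEBUG messages
    # from lockutils.py that appear throughout this log.
    pass

update_resources()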
Apr 21 10:46:59 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9457MB free_disk=26.811843872070312GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:46:59 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:47:01 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:02 user nova-compute[70954]: INFO nova.compute.claims [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Claim successful on node user Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 
tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Creating image(s) Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "/opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "/opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "/opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 
tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.part --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.part --force-share --output=json" returned: 0 in 0.126s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG nova.virt.images [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] 3b29a01a-1fc0-4d0d-89fb-23d22b2de02e was qcow2, converting to raw {{(pid=70954) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.part /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.converted {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG nova.policy [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54c67d90b6014d9ea24ef2552006bc04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aad84a0e014f47ddaeaddc88bf16b0a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.part /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.converted" returned: 0 in 0.207s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.converted --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a.converted --force-share --output=json" returned: 0 in 0.124s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.889s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:03 user nova-compute[70954]: INFO oslo.privsep.daemon [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpv2l1k00s/privsep.sock'] Apr 21 10:47:03 user sudo[79660]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpv2l1k00s/privsep.sock Apr 21 10:47:03 user sudo[79660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 21 10:47:05 user sudo[79660]: pam_unix(sudo:session): session closed for user root Apr 21 10:47:05 user nova-compute[70954]: INFO oslo.privsep.daemon [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Spawned new privsep daemon via rootwrap Apr 21 10:47:05 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 21 10:47:05 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 21 10:47:05 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep process running with capabilities 
(eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 21 10:47:05 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 79663 Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 
0.149s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk 1073741824" returned: 0 in 0.046s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.199s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Checking if we can resize image /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Cannot resize image /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lazy-loading 'migration_context' on Instance uuid 84b55fc0-e748-4c05-97ad-a6994c0487d2 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Ensure instance console log exists: /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:07 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Successfully created port: 2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:12 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Successfully updated port: 2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquired lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-changed-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Refreshing instance network info cache due to event network-changed-2a49817a-aed1-49bd-96b6-36286ff71e1c. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] Acquiring lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updating instance_info_cache with network_info: [{"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Releasing lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Instance network_info: |[{"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] Acquired lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.neutron [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Refreshing network info cache for port 2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Start _get_guest_xml network_info=[{"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": 
"tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:14 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:14 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 
tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-76757344',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-76757344',id=1,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-zj852vb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:03Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=84b55fc0-e748-4c05-97ad-a6994c0487d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": 
"fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lazy-loading 'pci_devices' on Instance uuid 84b55fc0-e748-4c05-97ad-a6994c0487d2 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] End _get_guest_xml xml= Apr 21 10:47:14 user nova-compute[70954]: 84b55fc0-e748-4c05-97ad-a6994c0487d2 Apr 21 10:47:14 user nova-compute[70954]: instance-00000001 Apr 21 10:47:14 user nova-compute[70954]: 131072 Apr 21 10:47:14 user nova-compute[70954]: 1 Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: tempest-ServerBootFromVolumeStableRescueTest-server-76757344 Apr 21 10:47:14 user nova-compute[70954]: 2023-04-21 10:47:14 Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: 128 Apr 21 10:47:14 user nova-compute[70954]: 1 Apr 21 10:47:14 user nova-compute[70954]: 0 Apr 21 10:47:14 user nova-compute[70954]: 0 Apr 21 10:47:14 user nova-compute[70954]: 1 Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member Apr 21 10:47:14 user nova-compute[70954]: tempest-ServerBootFromVolumeStableRescueTest-1980957418 Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user 
nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:14 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:14 user nova-compute[70954]: 0.0.0 Apr 21 10:47:14 user nova-compute[70954]: 84b55fc0-e748-4c05-97ad-a6994c0487d2 Apr 21 10:47:14 user nova-compute[70954]: 84b55fc0-e748-4c05-97ad-a6994c0487d2 Apr 21 10:47:14 user nova-compute[70954]: Virtual Machine Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: hvm Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Nehalem Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: /dev/urandom Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: Apr 21 10:47:14 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-76757344',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-76757344',id=1,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-zj852vb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:03Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=84b55fc0-e748-4c05-97ad-a6994c0487d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": 
"fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG os_vif [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Created schema index Interface.name {{(pid=70954) autocreate_indices 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Created schema index Port.name {{(pid=70954) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Created schema index Bridge.name {{(pid=70954) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [POLLOUT] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:14 user nova-compute[70954]: INFO nova.compute.claims [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Claim successful on node user Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:14 user nova-compute[70954]: INFO oslo.privsep.daemon [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpwaulg1lo/privsep.sock'] Apr 21 10:47:14 user sudo[79694]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpwaulg1lo/privsep.sock Apr 21 10:47:14 user sudo[79694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:14 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:14 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Creating image(s) Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "/opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "/opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "/opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:14 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 
tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG nova.policy [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb7625e4107240d5a92379ace66052fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f12ec80f50254e5bbc5afd5470546c71', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk 1073741824" returned: 0 in 0.057s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.205s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.175s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Cannot resize image /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG nova.objects.instance [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lazy-loading 'migration_context' on Instance uuid dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Ensure instance console log exists: /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:16 user sudo[79694]: pam_unix(sudo:session): session closed for user root Apr 21 10:47:16 user nova-compute[70954]: INFO oslo.privsep.daemon [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Spawned new privsep daemon via rootwrap Apr 21 10:47:16 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 21 10:47:16 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 21 10:47:16 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 21 10:47:16 user nova-compute[70954]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 79720 Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "f8609da3-c26d-482a-bc03-017baf4bce22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.network.neutron [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updated VIF entry in instance network info cache for port 2a49817a-aed1-49bd-96b6-36286ff71e1c. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.network.neutron [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updating instance_info_cache with network_info: [{"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1d3f7fcc-db1d-4b92-b70e-8a36dd446b9b req-59208acd-7932-46c7-a783-ac501b0d8657 service nova] Releasing lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:16 user nova-compute[70954]: INFO nova.compute.claims [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Claim successful on node user Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a49817a-ae, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a49817a-ae, col_values=(('external_ids', {'iface-id': '2a49817a-aed1-49bd-96b6-36286ff71e1c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:2d:82', 'vm-uuid': '84b55fc0-e748-4c05-97ad-a6994c0487d2'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:16 user nova-compute[70954]: INFO os_vif [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] No VIF found with MAC fa:16:3e:4f:2d:82, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.network.neutron [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:16 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Start building block device mappings for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Start spawning the instance on the hypervisor. {{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:16 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Creating image(s) Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "/opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "/opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "/opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.130s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:16 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG nova.policy [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ced216baa4a64c72946cf3f71eb873dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '648163a728fc4b28b85a24e9198d356b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk 1073741824 {{(pid=70954) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk 1073741824" returned: 0 in 0.041s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.159s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Cannot resize image /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG nova.objects.instance [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lazy-loading 'migration_context' on Instance uuid f8609da3-c26d-482a-bc03-017baf4bce22 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Ensure instance console log exists: /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:18 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Successfully created port: 781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:19 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] VM Resumed (Lifecycle Event) Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:19 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Instance spawned successfully. 
Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [None 
req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:19 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:19 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] VM Started (Lifecycle Event) Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:19 user nova-compute[70954]: INFO nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Took 16.70 seconds to spawn the instance on the hypervisor. Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:19 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:19 user nova-compute[70954]: INFO nova.compute.manager [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Took 17.44 seconds to build instance. 
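
The paired "Synchronizing instance power state after lifecycle event ..." and "During sync_power_state the instance has a pending task (spawning). Skip." lines come from the lifecycle-event handler comparing the power state recorded in the database with what the hypervisor reports, and deliberately doing nothing while a task is still in flight. A rough, hypothetical sketch of that decision, not Nova's actual implementation:

    # Hypothetical constants standing in for power-state codes (0 = no state, 1 = running).
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(instance, vm_power_state):
        """Decide whether to reconcile the DB power state with the hypervisor state."""
        if instance["task_state"] is not None:
            # Matches the log: a pending task (e.g. 'spawning') means the state is
            # expected to be in flux, so the sync is skipped.
            print(f"pending task ({instance['task_state']}), skip")
            return
        if instance["power_state"] != vm_power_state:
            instance["power_state"] = vm_power_state
            print("power state updated to", vm_power_state)

    sync_power_state(
        {"vm_state": "building", "task_state": "spawning", "power_state": NOSTATE},
        RUNNING,
    )
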
Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-3d747eb8-7a58-4371-a3ca-df095ec7a55c req-25b095fc-d3f0-43ce-ba7d-80c98d99dfe2 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3d747eb8-7a58-4371-a3ca-df095ec7a55c req-25b095fc-d3f0-43ce-ba7d-80c98d99dfe2 service nova] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3d747eb8-7a58-4371-a3ca-df095ec7a55c req-25b095fc-d3f0-43ce-ba7d-80c98d99dfe2 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3d747eb8-7a58-4371-a3ca-df095ec7a55c req-25b095fc-d3f0-43ce-ba7d-80c98d99dfe2 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-3d747eb8-7a58-4371-a3ca-df095ec7a55c req-25b095fc-d3f0-43ce-ba7d-80c98d99dfe2 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] No waiting events found dispatching network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:19 user nova-compute[70954]: WARNING nova.compute.manager [req-3d747eb8-7a58-4371-a3ca-df095ec7a55c req-25b095fc-d3f0-43ce-ba7d-80c98d99dfe2 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received unexpected event network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c for instance with vm_state active and task_state None. 
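
The recurring Acquiring / acquired / "released" triples around "compute_resources", the per-instance "-events" lock and the build lock are emitted by oslo.concurrency's lockutils whenever a decorated function enters and leaves its critical section, together with the waited/held timings seen above. A minimal sketch of the same mechanism; the lock name and function here are made up for illustration, only the decorator is the real oslo.concurrency API:

    import logging
    from oslo_concurrency import lockutils

    # lockutils logs its Acquiring/acquired/released messages at DEBUG level.
    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized("example_resources")  # in-process lock, like "compute_resources"
    def claim_resources():
        # Work performed while holding the lock; held time is reported on release.
        return "claimed"

    claim_resources()

Running this prints the same three-line pattern (Acquiring lock ... by ..., Lock ... acquired ... :: waited, Lock ... "released" ... :: held) that dominates this section of the log.
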
Apr 21 10:47:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5660a17b-c8be-435f-930b-b5424132b807 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.633s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "15bf9321-a92e-4be2-bcae-a943988c811a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:20 user nova-compute[70954]: INFO nova.compute.claims [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Claim successful on node user Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.network.neutron [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Successfully created port: f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.network.neutron [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:20 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:20 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Creating image(s) Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "/opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "/opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "/opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Successfully updated port: 781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.policy [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Policy check for 
network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd172d648a9474db082646a47a2840214', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4bdd7a4ccfc340aa9c1b02c57f7a0e70', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquired lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:20 user nova-compute[70954]: INFO nova.compute.claims [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Claim successful on node user Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.162s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:20 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.241s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk 1073741824 
{{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk 1073741824" returned: 0 in 0.070s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.316s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.163s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Checking if we can resize image /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.630s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Cannot resize image /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.objects.instance [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lazy-loading 'migration_context' on Instance uuid 15bf9321-a92e-4be2-bcae-a943988c811a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Ensure instance console log exists: /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:21 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] No waiting events found dispatching network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:21 user nova-compute[70954]: WARNING nova.compute.manager [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received unexpected event network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c for instance with vm_state active and task_state None. 
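
The "No waiting events found dispatching network-vif-plugged-..." line followed by the WARNING about an unexpected event reflects the dispatch pattern visible above: when Neutron reports a VIF plug, the compute service pops any registered waiter for that (instance, event) pair under the per-instance "-events" lock; if nobody is waiting and the instance is already active with no task, the event is logged and dropped. A hedged, stand-alone sketch of that pop-or-warn pattern (class and method names are illustrative, not Nova's API):

    import threading

    class InstanceEvents:
        """Toy registry mapping (instance, event tag) to a waiter."""
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}  # (instance_uuid, event_tag) -> threading.Event

        def prepare(self, instance_uuid, event_tag):
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_tag)] = ev
            return ev

        def pop_and_dispatch(self, instance_uuid, event_tag, vm_state, task_state):
            with self._lock:
                waiter = self._waiters.pop((instance_uuid, event_tag), None)
            if waiter is None:
                print(f"WARNING: unexpected event {event_tag} for instance "
                      f"with vm_state {vm_state} and task_state {task_state}")
                return
            waiter.set()  # wakes whoever is blocked waiting for the instance event

    events = InstanceEvents()
    events.pop_and_dispatch("84b55fc0", "network-vif-plugged-2a49817a", "active", None)
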
Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-changed-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Refreshing instance network info cache due to event network-changed-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] Acquiring lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Start spawning the instance on the hypervisor. {{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:21 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Creating image(s) Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "/opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "/opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "/opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.164s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.policy [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25fb0d890b594080bb1bb99dd6294ff1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd85f51547e5244e495343281725fe320', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} 
Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.165s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk 1073741824" returned: 0 in 0.054s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.224s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:22 user 
nova-compute[70954]: DEBUG nova.virt.disk.api [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updating instance_info_cache with network_info: [{"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Cannot resize image /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.objects.instance [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lazy-loading 'migration_context' on Instance uuid dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Ensure instance console log exists: /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Releasing lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Instance network_info: |[{"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": 
"br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] Acquired lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.network.neutron [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Refreshing network info cache for port 781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Start _get_guest_xml network_info=[{"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:22 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:22 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 
tempest-DeleteServersTestJSON-1827381813-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2137471025',display_name='tempest-DeleteServersTestJSON-server-2137471025',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-2137471025',id=2,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f12ec80f50254e5bbc5afd5470546c71',ramdisk_id='',reservation_id='r-pdoopfn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1827381813',owner_user_name='tempest-DeleteServersTestJSON-1827381813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:15Z,user_data=None,user_id='eb7625e4107240d5a92379ace66052fa',uuid=dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Converting VIF {"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": 
"4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.objects.instance [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lazy-loading 'pci_devices' on Instance uuid dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] End _get_guest_xml xml= Apr 21 10:47:22 user nova-compute[70954]: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 Apr 21 10:47:22 user nova-compute[70954]: instance-00000002 Apr 21 10:47:22 user nova-compute[70954]: 131072 Apr 21 10:47:22 user nova-compute[70954]: 1 Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: tempest-DeleteServersTestJSON-server-2137471025 Apr 21 10:47:22 user nova-compute[70954]: 2023-04-21 10:47:22 Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: 128 Apr 21 10:47:22 user nova-compute[70954]: 1 Apr 21 10:47:22 user nova-compute[70954]: 0 Apr 21 10:47:22 user nova-compute[70954]: 0 Apr 21 10:47:22 user nova-compute[70954]: 1 Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: tempest-DeleteServersTestJSON-1827381813-project-member Apr 21 10:47:22 user nova-compute[70954]: tempest-DeleteServersTestJSON-1827381813 Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: 
Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:22 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:22 user nova-compute[70954]: 0.0.0 Apr 21 10:47:22 user nova-compute[70954]: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 Apr 21 10:47:22 user nova-compute[70954]: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 Apr 21 10:47:22 user nova-compute[70954]: Virtual Machine Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: hvm Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Nehalem Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: /dev/urandom Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: Apr 21 10:47:22 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2137471025',display_name='tempest-DeleteServersTestJSON-server-2137471025',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-2137471025',id=2,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f12ec80f50254e5bbc5afd5470546c71',ramdisk_id='',reservation_id='r-pdoopfn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1827381813',owner_user_name='tempest-DeleteServersTestJSON-1827381813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:15Z,user_data=None,user_id='eb7625e4107240d5a92379ace66052fa',uuid=dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Converting VIF {"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": 
"4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG os_vif [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap781dee4b-a8, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tap781dee4b-a8, col_values=(('external_ids', {'iface-id': '781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:18:56', 'vm-uuid': 'dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:22 user nova-compute[70954]: INFO os_vif [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] No VIF found with MAC fa:16:3e:77:18:56, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:23 user nova-compute[70954]: DEBUG nova.network.neutron [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Successfully updated port: f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:23 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:23 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 
tempest-ServerActionsTestJSON-1614287361-project-member] Acquired lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:23 user nova-compute[70954]: DEBUG nova.network.neutron [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG nova.compute.manager [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-changed-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG nova.compute.manager [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Refreshing instance network info cache due to event network-changed-f210779b-302b-4a17-8b57-07837ea54e12. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] Acquiring lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG nova.network.neutron [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.neutron [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Successfully created port: fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.neutron [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updated VIF entry in instance network info cache for port 781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.neutron [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updating instance_info_cache with network_info: [{"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-78f7508e-82e2-477a-aec1-83b81219f7fe req-23ebb90c-411e-4596-8103-5003bb8a2ec8 service nova] Releasing lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.compute.manager [req-34d9bf6e-4fe9-4f92-aa58-648774ca3946 req-5e83364c-8816-4d6d-90d8-413d735ad1a8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-34d9bf6e-4fe9-4f92-aa58-648774ca3946 req-5e83364c-8816-4d6d-90d8-413d735ad1a8 service nova] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-34d9bf6e-4fe9-4f92-aa58-648774ca3946 req-5e83364c-8816-4d6d-90d8-413d735ad1a8 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-34d9bf6e-4fe9-4f92-aa58-648774ca3946 req-5e83364c-8816-4d6d-90d8-413d735ad1a8 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.compute.manager [req-34d9bf6e-4fe9-4f92-aa58-648774ca3946 req-5e83364c-8816-4d6d-90d8-413d735ad1a8 service nova] 
[instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] No waiting events found dispatching network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:25 user nova-compute[70954]: WARNING nova.compute.manager [req-34d9bf6e-4fe9-4f92-aa58-648774ca3946 req-5e83364c-8816-4d6d-90d8-413d735ad1a8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received unexpected event network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 for instance with vm_state building and task_state spawning. Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.neutron [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Releasing lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Instance network_info: |[{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] Acquired lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.neutron [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Refreshing network info cache for port f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Start _get_guest_xml network_info=[{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:25 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] 
This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:25 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1307788712',display_name='tempest-ServerActionsTestJSON-server-1307788712',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1307788712',id=3,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP17pDhSIQbCv4xewaSR+c65YmMH+hIkRmyXO1jHYq3hmftzXxLb6EXcvZayMHXJMHoDUOwUfoaQ/r3kME39pIqEI1cveoujwBV7i5jBCcTH71kCrlaE9KNWPqoT9mc/lQ==',key_name='tempest-keypair-1735824251',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='648163a728fc4b28b85a24e9198d356b',ramdisk_id='',reservation_id='r-0qt8u06d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1614287361',owner_user_name='tempest-ServerActionsTestJSON-1614287361-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ced216baa4a64c72946cf3f71eb873dd',uuid=f8609da3-c26d-482a-bc03-017baf4bce22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Converting VIF {"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.objects.instance [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lazy-loading 'pci_devices' on Instance uuid f8609da3-c26d-482a-bc03-017baf4bce22 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] End _get_guest_xml xml= Apr 21 10:47:25 user nova-compute[70954]: f8609da3-c26d-482a-bc03-017baf4bce22 Apr 21 10:47:25 user nova-compute[70954]: instance-00000003 Apr 21 10:47:25 user nova-compute[70954]: 131072 Apr 21 10:47:25 user nova-compute[70954]: 1 Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: tempest-ServerActionsTestJSON-server-1307788712 Apr 21 10:47:25 user nova-compute[70954]: 2023-04-21 10:47:25 Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: 128 Apr 21 10:47:25 user nova-compute[70954]: 1 Apr 21 10:47:25 user nova-compute[70954]: 0 Apr 21 10:47:25 user nova-compute[70954]: 0 Apr 21 10:47:25 user nova-compute[70954]: 1 Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: tempest-ServerActionsTestJSON-1614287361-project-member Apr 21 10:47:25 user nova-compute[70954]: tempest-ServerActionsTestJSON-1614287361 Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:25 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:25 user nova-compute[70954]: 0.0.0 Apr 21 10:47:25 user nova-compute[70954]: 
f8609da3-c26d-482a-bc03-017baf4bce22 Apr 21 10:47:25 user nova-compute[70954]: f8609da3-c26d-482a-bc03-017baf4bce22 Apr 21 10:47:25 user nova-compute[70954]: Virtual Machine Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: hvm Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Nehalem Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: /dev/urandom Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: Apr 21 10:47:25 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1307788712',display_name='tempest-ServerActionsTestJSON-server-1307788712',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1307788712',id=3,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP17pDhSIQbCv4xewaSR+c65YmMH+hIkRmyXO1jHYq3hmftzXxLb6EXcvZayMHXJMHoDUOwUfoaQ/r3kME39pIqEI1cveoujwBV7i5jBCcTH71kCrlaE9KNWPqoT9mc/lQ==',key_name='tempest-keypair-1735824251',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='648163a728fc4b28b85a24e9198d356b',ramdisk_id='',reservation_id='r-0qt8u06d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1614287361',owner_user_name='tempest-ServerActionsTestJSON-1614287361-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ced216baa4a64c72946cf3f71eb873dd',uuid=f8609da3-c26d-482a-bc03-017baf4bce22,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Converting VIF {"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG os_vif [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf210779b-30, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf210779b-30, col_values=(('external_ids', {'iface-id': 'f210779b-302b-4a17-8b57-07837ea54e12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:c6:d1', 'vm-uuid': 'f8609da3-c26d-482a-bc03-017baf4bce22'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:25 user nova-compute[70954]: INFO os_vif [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:25 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] No VIF found with MAC fa:16:3e:c3:c6:d1, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Successfully created port: b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] VM Resumed (Lifecycle Event) Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.compute.manager 
[None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:26 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Instance spawned successfully. Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: 
dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:26 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] VM Started (Lifecycle Event) Apr 21 10:47:27 user nova-compute[70954]: INFO nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Took 12.12 seconds to spawn the instance on the hypervisor. Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:27 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] During sync_power_state the instance has a pending task (spawning). Skip. 
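The two timing lines around this point ("Took 12.12 seconds to spawn the instance on the hypervisor", followed shortly by the build total) are emitted once per boot, which makes them a convenient hook when skimming a long journal for slow builds. The following throwaway Python helper is only a reading aid for log text in exactly this format; it is not part of Nova, and feeding it a saved journal excerpt on stdin (the file name below is assumed) is just one way to use it.

#!/usr/bin/env python3
"""Illustrative helper (not part of Nova): summarize per-instance spawn and
build timings from nova-compute journal text in the format shown in this log."""
import re
import sys

# Matches e.g. "[instance: dd34ae7e-...] Took 12.12 seconds to spawn the instance on the hypervisor."
TIMING = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
    r"Took (?P<secs>\d+\.\d+) seconds to "
    r"(?P<what>spawn the instance on the hypervisor|build instance)"
)


def summarize(lines):
    timings = {}  # uuid -> {"spawn": seconds, "build": seconds}
    for line in lines:
        m = TIMING.search(line)
        if m:
            key = "spawn" if m.group("what").startswith("spawn") else "build"
            timings.setdefault(m.group("uuid"), {})[key] = float(m.group("secs"))
    return timings


if __name__ == "__main__":
    # e.g. python3 spawn_times.py < nova-compute.log  (file name is an assumption)
    for uuid, t in sorted(summarize(sys.stdin).items()):
        print(f"{uuid}  spawn={t.get('spawn', '?')}s  build={t.get('build', '?')}s")

For dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 this reports spawn=12.12s and build=12.95s; the gap between the two is the rest of the build path (resource claim, network allocation and the other steps traced earlier in this log) rather than the hypervisor spawn itself.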
Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "aecf1ba8-9675-4535-874b-9084361b7693" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:27 user nova-compute[70954]: INFO nova.compute.manager [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Took 12.95 seconds to build instance. Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ff195b2c-3d68-4076-94be-265ba32ac3d4 tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.124s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:27 user nova-compute[70954]: INFO nova.compute.claims [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Claim successful on node user Apr 21 10:47:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [req-b1eb8dc9-aada-4665-9f06-a70807e2c84a req-c3a1da01-1115-467b-bd39-f9d8b27c10f0 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b1eb8dc9-aada-4665-9f06-a70807e2c84a req-c3a1da01-1115-467b-bd39-f9d8b27c10f0 service nova] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b1eb8dc9-aada-4665-9f06-a70807e2c84a req-c3a1da01-1115-467b-bd39-f9d8b27c10f0 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b1eb8dc9-aada-4665-9f06-a70807e2c84a req-c3a1da01-1115-467b-bd39-f9d8b27c10f0 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [req-b1eb8dc9-aada-4665-9f06-a70807e2c84a req-c3a1da01-1115-467b-bd39-f9d8b27c10f0 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] No waiting events found dispatching network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:27 user nova-compute[70954]: WARNING nova.compute.manager [req-b1eb8dc9-aada-4665-9f06-a70807e2c84a req-c3a1da01-1115-467b-bd39-f9d8b27c10f0 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received unexpected event network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 for instance with vm_state active and task_state None. 
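Most of the lock lines in this stretch come from oslo.concurrency's synchronized wrapper, which records both how long the caller waited for a lock and how long it was held (the dd34ae7e build lock above, for example, was held for 13.124s, and "compute_resources" is taken and released around every claim). When looking for contention in a capture like this, a small filter over those lines is enough. This is only an illustrative script for the exact format seen here, not anything Nova or oslo.concurrency provides, and the 1.0 second threshold is an arbitrary choice.

#!/usr/bin/env python3
"""Illustrative filter (not part of Nova): print oslo.concurrency lock log
lines whose waited/held time exceeds a threshold, to spot contention."""
import re
import sys

# Matches the two variants present in this log:
#   Lock "name" acquired by "owner" :: waited 0.001s
#   Lock "name" "released" by "owner" :: held 0.600s
LOCK = re.compile(
    r'Lock "(?P<name>[^"]+)" (?:acquired|"released") by "(?P<owner>[^"]+)" :: '
    r'(?P<kind>waited|held) (?P<secs>\d+\.\d+)s'
)

THRESHOLD = 1.0  # seconds; arbitrary cut-off for this sketch

for line in sys.stdin:
    m = LOCK.search(line)
    if m and float(m.group("secs")) >= THRESHOLD:
        print(f'{m.group("kind"):>6} {m.group("secs")}s  '
              f'lock={m.group("name")}  by={m.group("owner")}')

On this section it would flag the 13.124s hold on the dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 build lock and the 1.081s hold on the image cache lock further down, while the compute_resources claim (held 0.600s) stays under the threshold.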
Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.600s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.network.neutron [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:28 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Start building block device mappings for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.network.neutron [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updated VIF entry in instance network info cache for port f210779b-302b-4a17-8b57-07837ea54e12. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.network.neutron [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-04cb7474-2a77-42a0-9bf3-9fd67a447a7f req-5e254789-4b10-4ef5-a78f-ead30df9534a service nova] Releasing lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:28 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Creating image(s) Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "/opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "/opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "/opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.011s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b 
tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.part --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:28 user nova-compute[70954]: DEBUG nova.policy [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0b1fae0fc8d4b9a998ab0679bace1a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a3d2c4f2fb9f45559c4e51e86339a0e0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Successfully updated port: fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquired lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.part --force-share --output=json" returned: 0 in 0.177s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.virt.images [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 
tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] 140913c9-4f31-4f27-b107-9d11bf6d2801 was qcow2, converting to raw {{(pid=70954) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.part /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.converted {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-changed-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Refreshing instance network info cache due to event network-changed-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] Acquiring lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.part /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.converted" returned: 0 in 0.159s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.converted --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015.converted --force-share --output=json" returned: 0 in 0.184s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.081s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015 --force-share --output=json" returned: 0 in 0.244s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-d9fc2c83-2f60-43c1-b23a-be724824ad2c req-fec729b3-99d8-4785-927c-bf35667e7bd5 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d9fc2c83-2f60-43c1-b23a-be724824ad2c req-fec729b3-99d8-4785-927c-bf35667e7bd5 service nova] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d9fc2c83-2f60-43c1-b23a-be724824ad2c req-fec729b3-99d8-4785-927c-bf35667e7bd5 service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d9fc2c83-2f60-43c1-b23a-be724824ad2c req-fec729b3-99d8-4785-927c-bf35667e7bd5 service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-d9fc2c83-2f60-43c1-b23a-be724824ad2c req-fec729b3-99d8-4785-927c-bf35667e7bd5 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] No waiting events found dispatching network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:29 user nova-compute[70954]: WARNING nova.compute.manager [req-d9fc2c83-2f60-43c1-b23a-be724824ad2c req-fec729b3-99d8-4785-927c-bf35667e7bd5 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received unexpected event network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 for instance with vm_state building and task_state spawning. 
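Note on the qemu-img probes above: nova does not run "qemu-img info" bare. It shells out through oslo.concurrency and prepends "python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30", so the probe is killed if it exceeds 1 GiB of address space or 30 s of CPU time (a guard against hostile or malformed images). The following is a minimal sketch of the same call using only stock oslo.concurrency; the base-image path is copied from the log, everything else is illustrative, not nova's actual helper.

    # Sketch: reproduce the prlimit-guarded "qemu-img info" probe seen in the log.
    # Assumes oslo.concurrency is installed; BASE is the cached base image from the log.
    import json
    from oslo_concurrency import processutils

    BASE = '/opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015'

    # prlimit=... makes oslo.concurrency re-exec the command under
    # "python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ...",
    # which is the wrapper visible in the log lines above.
    limits = processutils.ProcessLimits(address_space=1 * 1024 ** 3, cpu_time=30)

    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
        prlimit=limits)

    info = json.loads(out)
    print(info['format'], info['virtual-size'])

The JSON output (format and virtual-size) is what the image backend uses in the next steps to size the qcow2 overlay and to decide whether a resize is possible.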
Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015 --force-share --output=json" returned: 0 in 0.185s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015,backing_fmt=raw /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:29 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] VM Resumed (Lifecycle Event) Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015,backing_fmt=raw /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk 1073741824" returned: 0 in 0.068s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.256s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:29 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:29 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Instance spawned successfully. Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:30 user nova-compute[70954]: 
DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:30 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:30 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] VM Started (Lifecycle Event) Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015 --force-share --output=json" returned: 0 in 0.154s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Checking if we can resize image /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:30 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:30 user nova-compute[70954]: INFO nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Took 13.40 seconds to spawn the instance on the hypervisor. Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json" returned: 0 in 0.191s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Cannot resize image /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.objects.instance [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lazy-loading 'migration_context' on Instance uuid aecf1ba8-9675-4535-874b-9084361b7693 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Ensure instance console log exists: /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:30 user nova-compute[70954]: INFO nova.compute.manager [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Took 14.18 seconds to build instance. 
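For the second instance (aecf1ba8-...), the sequence in the log is: take a file lock named after the base-image hash, create a qcow2 overlay whose backing file is the raw base image, then check whether the overlay may be grown to the flavor's root disk size. can_resize_image only allows growing, so when the requested size is not larger than the overlay's current virtual size it logs "Cannot resize image ... to a smaller size" and skips the resize. Below is a rough sketch of that sequence using plain oslo.concurrency and qemu-img rather than nova's internal image backend (nova.virt.libvirt.imagebackend); paths and the 1073741824-byte size are taken from the log, while the lock_path is an assumed, illustrative location.

    # Sketch: per-base-image lock plus qcow2 overlay creation, mirroring the
    # create_qcow2_image step seen in the log. Not nova's real code path.
    from oslo_concurrency import lockutils, processutils

    BASE = '/opt/stack/data/nova/instances/_base/58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015'
    DISK = '/opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk'
    ROOT_BYTES = 1 * 1024 ** 3  # flavor root_gb=1, i.e. the 1073741824 in the log

    @lockutils.synchronized('58ae37b3c1c1914a5d5d2a2fd7ec3bc1c5efa015',
                            external=True,
                            lock_path='/opt/stack/data/nova')  # assumed lock directory
    def create_overlay():
        # qcow2 overlay backed by the raw base image, sized to the flavor root disk.
        processutils.execute(
            'qemu-img', 'create', '-f', 'qcow2',
            '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
            DISK, str(ROOT_BYTES))

    create_overlay()

Holding the external lock for the duration of the overlay creation is what produces the "acquired ... :: waited" / "released ... :: held" pairs around the qemu-img create command in the surrounding log entries.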
Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-42b5fd9e-b0fb-4f3f-bae5-1ae5b57eac93 tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "f8609da3-c26d-482a-bc03-017baf4bce22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.313s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Successfully updated port: b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquired lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:30 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-changed-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Refreshing instance network info cache due to event network-changed-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] Acquiring lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.neutron [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updating instance_info_cache with network_info: [{"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Releasing lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Instance network_info: |[{"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] Acquired lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.neutron [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Refreshing network info cache for port fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Start _get_guest_xml network_info=[{"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:31 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:31 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-830796194',display_name='tempest-ServerStableDeviceRescueTest-server-830796194',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-830796194',id=4,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHOLDYACefR6pJbo/1FNlJcU0uVLNhwHibyQxq6uGxw58mBzlostFRjitCW5kqYo4/rT+TGHwIPAMOKYgrhYN17TXx7fyo6rQDJa7QLpDa2shAHPXuXXSRjnzvc+xkQMdw==',key_name='tempest-keypair-474843765',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4bdd7a4ccfc340aa9c1b02c57f7a0e70',ramdisk_id='',reservation_id='r-4605i77j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-335595160',owner_user_name='tempest-ServerStableDeviceRescueTest-335595160-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d172d648a9474db082646a47a2840214',uuid=15bf9321-a92e-4be2-bcae-a943988c811a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Converting VIF {"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.objects.instance [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lazy-loading 'pci_devices' on Instance uuid 15bf9321-a92e-4be2-bcae-a943988c811a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] End _get_guest_xml xml= Apr 21 10:47:31 user nova-compute[70954]: 15bf9321-a92e-4be2-bcae-a943988c811a Apr 21 10:47:31 user nova-compute[70954]: instance-00000004 Apr 21 10:47:31 user nova-compute[70954]: 131072 Apr 21 10:47:31 user nova-compute[70954]: 1 Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: tempest-ServerStableDeviceRescueTest-server-830796194 Apr 21 10:47:31 user nova-compute[70954]: 2023-04-21 10:47:31 Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: 128 Apr 21 10:47:31 user nova-compute[70954]: 1 Apr 21 10:47:31 user nova-compute[70954]: 0 Apr 21 10:47:31 user nova-compute[70954]: 0 Apr 21 10:47:31 user nova-compute[70954]: 1 Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: tempest-ServerStableDeviceRescueTest-335595160-project-member Apr 21 10:47:31 user nova-compute[70954]: tempest-ServerStableDeviceRescueTest-335595160 Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:31 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:31 
user nova-compute[70954]: 0.0.0 Apr 21 10:47:31 user nova-compute[70954]: 15bf9321-a92e-4be2-bcae-a943988c811a Apr 21 10:47:31 user nova-compute[70954]: 15bf9321-a92e-4be2-bcae-a943988c811a Apr 21 10:47:31 user nova-compute[70954]: Virtual Machine Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: hvm Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Nehalem Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: /dev/urandom Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: Apr 21 10:47:31 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-830796194',display_name='tempest-ServerStableDeviceRescueTest-server-830796194',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-830796194',id=4,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHOLDYACefR6pJbo/1FNlJcU0uVLNhwHibyQxq6uGxw58mBzlostFRjitCW5kqYo4/rT+TGHwIPAMOKYgrhYN17TXx7fyo6rQDJa7QLpDa2shAHPXuXXSRjnzvc+xkQMdw==',key_name='tempest-keypair-474843765',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4bdd7a4ccfc340aa9c1b02c57f7a0e70',ramdisk_id='',reservation_id='r-4605i77j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-335595160',owner_user_name='tempest-ServerStableDeviceRescueTest-335595160-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d172d648a9474db082646a47a2840214',uuid=15bf9321-a92e-4be2-bcae-a943988c811a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Converting VIF {"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG os_vif [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfca8b6a6-fd, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfca8b6a6-fd, col_values=(('external_ids', {'iface-id': 'fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:6e:bf', 'vm-uuid': '15bf9321-a92e-4be2-bcae-a943988c811a'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:31 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:31 user nova-compute[70954]: INFO os_vif [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] No VIF found with MAC fa:16:3e:60:6e:bf, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-9b5b030c-52bd-4a2f-80af-522db964cd0c req-b256ed78-eafc-4b0c-b43e-f8e31a5b01c5 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9b5b030c-52bd-4a2f-80af-522db964cd0c req-b256ed78-eafc-4b0c-b43e-f8e31a5b01c5 service nova] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9b5b030c-52bd-4a2f-80af-522db964cd0c req-b256ed78-eafc-4b0c-b43e-f8e31a5b01c5 service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9b5b030c-52bd-4a2f-80af-522db964cd0c req-b256ed78-eafc-4b0c-b43e-f8e31a5b01c5 service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-9b5b030c-52bd-4a2f-80af-522db964cd0c 
req-b256ed78-eafc-4b0c-b43e-f8e31a5b01c5 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] No waiting events found dispatching network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:31 user nova-compute[70954]: WARNING nova.compute.manager [req-9b5b030c-52bd-4a2f-80af-522db964cd0c req-b256ed78-eafc-4b0c-b43e-f8e31a5b01c5 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received unexpected event network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 for instance with vm_state active and task_state None. Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Updating instance_info_cache with network_info: [{"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Releasing lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Instance network_info: |[{"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] Acquired lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.network.neutron [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Refreshing network info cache for port b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Start _get_guest_xml network_info=[{"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:32 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 
tempest-AttachVolumeTestJSON-2130575493-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:32 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-721132263',display_name='tempest-AttachVolumeTestJSON-server-721132263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-721132263',id=5,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE61zuAXT232aH/KOTnubmgBMkuEfigCy73bZO4uuf2B23JR41s8cx2vf+RH51d7wxX9P1MtxP7zNqYI2bDeqdfZasdq2OLkldcjqDGH3vLtRM+8mAr7ZBtqN4SKtJs0UQ==',key_name='tempest-keypair-495885922',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d85f51547e5244e495343281725fe320',ramdisk_id='',reservation_id='r-0s4941kd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2130575493',owner_user_name='tempest-AttachVolumeTestJSON-2130575493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25fb0d890b594080bb1bb99dd6294ff1',uuid=dd4d15a1-3a71-49e8-9851-9b49fec6a9e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converting VIF {"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.objects.instance [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lazy-loading 'pci_devices' on Instance uuid dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] End _get_guest_xml xml= Apr 21 10:47:32 user nova-compute[70954]: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 Apr 21 10:47:32 user nova-compute[70954]: instance-00000005 Apr 21 10:47:32 user nova-compute[70954]: 131072 Apr 21 10:47:32 user nova-compute[70954]: 1 Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: tempest-AttachVolumeTestJSON-server-721132263 Apr 21 10:47:32 user nova-compute[70954]: 2023-04-21 10:47:32 Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: 128 Apr 21 10:47:32 user nova-compute[70954]: 1 Apr 21 10:47:32 user nova-compute[70954]: 0 Apr 21 10:47:32 user nova-compute[70954]: 0 Apr 21 10:47:32 user nova-compute[70954]: 1 Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: tempest-AttachVolumeTestJSON-2130575493-project-member Apr 21 10:47:32 user nova-compute[70954]: tempest-AttachVolumeTestJSON-2130575493 Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:32 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:32 user nova-compute[70954]: 0.0.0 Apr 21 10:47:32 user nova-compute[70954]: 
dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 Apr 21 10:47:32 user nova-compute[70954]: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 Apr 21 10:47:32 user nova-compute[70954]: Virtual Machine Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: hvm Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Nehalem Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: /dev/urandom Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: Apr 21 10:47:32 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-721132263',display_name='tempest-AttachVolumeTestJSON-server-721132263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-721132263',id=5,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE61zuAXT232aH/KOTnubmgBMkuEfigCy73bZO4uuf2B23JR41s8cx2vf+RH51d7wxX9P1MtxP7zNqYI2bDeqdfZasdq2OLkldcjqDGH3vLtRM+8mAr7ZBtqN4SKtJs0UQ==',key_name='tempest-keypair-495885922',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d85f51547e5244e495343281725fe320',ramdisk_id='',reservation_id='r-0s4941kd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2130575493',owner_user_name='tempest-AttachVolumeTestJSON-2130575493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25fb0d890b594080bb1bb99dd6294ff1',uuid=dd4d15a1-3a71-49e8-9851-9b49fec6a9e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converting VIF {"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG os_vif [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb08cd847-5a, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb08cd847-5a, col_values=(('external_ids', {'iface-id': 'b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:4b:e7', 'vm-uuid': 'dd4d15a1-3a71-49e8-9851-9b49fec6a9e3'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:32 user nova-compute[70954]: INFO os_vif [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] No VIF found with MAC fa:16:3e:c4:4b:e7, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:47:33 user nova-compute[70954]: INFO nova.compute.claims [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Claim successful on node user Apr 21 10:47:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.network.neutron [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Successfully created port: 892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.network.neutron [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updated VIF entry in instance network info cache for port fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.network.neutron [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updating instance_info_cache with network_info: [{"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.728s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-122e2cc7-59c3-4a75-bbb5-7ab040a46380 req-c4bb832e-bfb2-4f0e-bc9b-1cc0c4807d87 service nova] Releasing lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.compute.manager [req-b2e7db6f-7790-4001-ba3c-e9e41848b688 req-92364ab3-77f6-48da-ada1-dc68215c50bd service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b2e7db6f-7790-4001-ba3c-e9e41848b688 req-92364ab3-77f6-48da-ada1-dc68215c50bd service nova] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b2e7db6f-7790-4001-ba3c-e9e41848b688 req-92364ab3-77f6-48da-ada1-dc68215c50bd service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b2e7db6f-7790-4001-ba3c-e9e41848b688 req-92364ab3-77f6-48da-ada1-dc68215c50bd service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.compute.manager [req-b2e7db6f-7790-4001-ba3c-e9e41848b688 req-92364ab3-77f6-48da-ada1-dc68215c50bd service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] No waiting events found dispatching network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:34 user nova-compute[70954]: WARNING nova.compute.manager [req-b2e7db6f-7790-4001-ba3c-e9e41848b688 req-92364ab3-77f6-48da-ada1-dc68215c50bd service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received unexpected event network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d for instance with vm_state building and task_state spawning. Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.network.neutron [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:47:34 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:47:34 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Creating image(s) Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "/opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "/opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "/opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.004s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.171s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.policy [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c600e01acfe140cabcdfe54958e66108', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '94e77e1735854e0c966c42e9a613017f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.158s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk 1073741824" returned: 0 in 0.049s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.214s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.167s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Checking if we can resize image /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.network.neutron [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Updated VIF entry in instance network info cache for port b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG nova.network.neutron [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Updating instance_info_cache with network_info: [{"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fdf54860-9b04-419a-96d4-fdd3477b2a75 req-946c63b7-60be-47d6-80fe-3a705abdaa0c service nova] Releasing lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share 
--output=json" returned: 0 in 0.204s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Cannot resize image /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG nova.objects.instance [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lazy-loading 'migration_context' on Instance uuid 8ae797bd-c587-43a3-b941-e6d6d6c74e51 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Ensure instance console log exists: /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG nova.compute.manager [req-659ed031-bb52-427c-ba3a-b0bda3d29727 req-9e10b87b-49bc-4720-96c7-970b9f8b1e12 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-659ed031-bb52-427c-ba3a-b0bda3d29727 
req-9e10b87b-49bc-4720-96c7-970b9f8b1e12 service nova] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-659ed031-bb52-427c-ba3a-b0bda3d29727 req-9e10b87b-49bc-4720-96c7-970b9f8b1e12 service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-659ed031-bb52-427c-ba3a-b0bda3d29727 req-9e10b87b-49bc-4720-96c7-970b9f8b1e12 service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG nova.compute.manager [req-659ed031-bb52-427c-ba3a-b0bda3d29727 req-9e10b87b-49bc-4720-96c7-970b9f8b1e12 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] No waiting events found dispatching network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:35 user nova-compute[70954]: WARNING nova.compute.manager [req-659ed031-bb52-427c-ba3a-b0bda3d29727 req-9e10b87b-49bc-4720-96c7-970b9f8b1e12 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received unexpected event network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c for instance with vm_state building and task_state spawning. 
Apr 21 10:47:35 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:35 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:36 user nova-compute[70954]: DEBUG nova.compute.manager [req-7c9bd302-a93e-4fb5-b244-ee3604229246 req-3671626f-851f-44b2-940f-0a9cdf89847d service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7c9bd302-a93e-4fb5-b244-ee3604229246 req-3671626f-851f-44b2-940f-0a9cdf89847d service nova] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7c9bd302-a93e-4fb5-b244-ee3604229246 req-3671626f-851f-44b2-940f-0a9cdf89847d service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7c9bd302-a93e-4fb5-b244-ee3604229246 req-3671626f-851f-44b2-940f-0a9cdf89847d service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:36 user nova-compute[70954]: DEBUG nova.compute.manager [req-7c9bd302-a93e-4fb5-b244-ee3604229246 req-3671626f-851f-44b2-940f-0a9cdf89847d service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] No waiting events found dispatching network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:36 user nova-compute[70954]: WARNING nova.compute.manager [req-7c9bd302-a93e-4fb5-b244-ee3604229246 req-3671626f-851f-44b2-940f-0a9cdf89847d service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received unexpected event network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d for instance with vm_state building and task_state spawning. 
Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Successfully updated port: 892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquired lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] VM Resumed (Lifecycle Event) Apr 21 10:47:37 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Instance spawned successfully. Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 
15bf9321-a92e-4be2-bcae-a943988c811a] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] VM Started (Lifecycle Event) Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Successfully created port: 44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Instance spawned successfully. 
Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] No waiting events found dispatching network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:37 user nova-compute[70954]: WARNING nova.compute.manager [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received unexpected event network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c for instance with vm_state building and task_state spawning. 
Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-changed-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Refreshing instance network info cache due to event network-changed-892719ba-88a5-4998-9b27-c47babc15f5c. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] Acquiring lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Took 16.91 seconds to spawn the instance on the hypervisor. Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None 
req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] VM Resumed (Lifecycle Event) Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Took 17.70 seconds to build instance. 
Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-cdcf6209-6d7d-4344-ae1d-788df8b62401 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "15bf9321-a92e-4be2-bcae-a943988c811a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.976s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] VM Started (Lifecycle Event) Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Took 16.04 seconds to spawn the instance on the hypervisor. Apr 21 10:47:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:37 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:38 user nova-compute[70954]: INFO nova.compute.manager [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Took 17.16 seconds to build instance. 
Apr 21 10:47:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4ee85eb8-d9c0-4b24-8eee-b9f0337457ab tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.325s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.network.neutron [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Updating instance_info_cache with network_info: [{"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Releasing lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Instance network_info: |[{"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] Acquired lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.network.neutron [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Refreshing network info cache for port 892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Start _get_guest_xml network_info=[{"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:47:18Z,direct_url=,disk_format='qcow2',id=140913c9-4f31-4f27-b107-9d11bf6d2801,min_disk=0,min_ram=0,name='',owner='7fcbf21c326e4e4ab28b051c38b9da8e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:47:19Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'device_name': '/dev/sda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'scsi', 'encryption_format': None, 'guest_format': None, 'image_id': '140913c9-4f31-4f27-b107-9d11bf6d2801'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:38 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] This host appears to 
have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:38 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:47:18Z,direct_url=,disk_format='qcow2',id=140913c9-4f31-4f27-b107-9d11bf6d2801,min_disk=0,min_ram=0,name='',owner='7fcbf21c326e4e4ab28b051c38b9da8e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:47:19Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-21T10:47:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1366625366',display_name='tempest-AttachSCSIVolumeTestJSON-server-1366625366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1366625366',id=6,image_ref='140913c9-4f31-4f27-b107-9d11bf6d2801',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBiXQMKehmXbZ+t/TYVT0gSKs0KmTZ5FlYz6zFGTYVIOj5Jx7gtGmebbyFsJUNZbybkDh6qpI1q+o00ju2IhzrS6d4GO5cz3RO8d1HNO4lgr/58RDDJYurBNqFihZhWt2A==',key_name='tempest-keypair-538215549',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3d2c4f2fb9f45559c4e51e86339a0e0',ramdisk_id='',reservation_id='r-1n6qmth7',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='140913c9-4f31-4f27-b107-9d11bf6d2801',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1586367620',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1586367620-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c0b1fae0fc8d4b9a998ab0679bace1a1',uuid=aecf1ba8-9675-4535-874b-9084361b7693,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Converting VIF {"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.objects.instance [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lazy-loading 'pci_devices' on Instance uuid aecf1ba8-9675-4535-874b-9084361b7693 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] End _get_guest_xml xml= Apr 21 10:47:38 user nova-compute[70954]: aecf1ba8-9675-4535-874b-9084361b7693 Apr 21 10:47:38 user nova-compute[70954]: instance-00000006 Apr 21 10:47:38 user nova-compute[70954]: 131072 Apr 21 10:47:38 user nova-compute[70954]: 1 Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: tempest-AttachSCSIVolumeTestJSON-server-1366625366 Apr 21 10:47:38 user nova-compute[70954]: 2023-04-21 10:47:38 Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: 128 Apr 21 10:47:38 user nova-compute[70954]: 1 Apr 21 10:47:38 user nova-compute[70954]: 0 Apr 21 10:47:38 user nova-compute[70954]: 0 Apr 21 10:47:38 user nova-compute[70954]: 1 Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: tempest-AttachSCSIVolumeTestJSON-1586367620-project-member Apr 21 10:47:38 user nova-compute[70954]: tempest-AttachSCSIVolumeTestJSON-1586367620 Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:38 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:38 user nova-compute[70954]: 0.0.0 Apr 21 10:47:38 user nova-compute[70954]: aecf1ba8-9675-4535-874b-9084361b7693 Apr 21 10:47:38 user 
nova-compute[70954]: aecf1ba8-9675-4535-874b-9084361b7693 Apr 21 10:47:38 user nova-compute[70954]: Virtual Machine Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: hvm Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Nehalem Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]:
Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]:
Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: /dev/urandom Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: Apr 21 10:47:38 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-21T10:47:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1366625366',display_name='tempest-AttachSCSIVolumeTestJSON-server-1366625366',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1366625366',id=6,image_ref='140913c9-4f31-4f27-b107-9d11bf6d2801',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBiXQMKehmXbZ+t/TYVT0gSKs0KmTZ5FlYz6zFGTYVIOj5Jx7gtGmebbyFsJUNZbybkDh6qpI1q+o00ju2IhzrS6d4GO5cz3RO8d1HNO4lgr/58RDDJYurBNqFihZhWt2A==',key_name='tempest-keypair-538215549',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a3d2c4f2fb9f45559c4e51e86339a0e0',ramdisk_id='',reservation_id='r-1n6qmth7',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='140913c9-4f31-4f27-b107-9d11bf6d2801',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1586367620',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1586367620-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c0b1fae0fc8d4b9a998ab0679bace1a1',uuid=aecf1ba8-9675-4535-874b-9084361b7693,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Converting VIF {"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG os_vif [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap892719ba-88, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap892719ba-88, col_values=(('external_ids', {'iface-id': '892719ba-88a5-4998-9b27-c47babc15f5c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:33:fd', 'vm-uuid': 'aecf1ba8-9675-4535-874b-9084361b7693'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:38 user nova-compute[70954]: INFO os_vif [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] No BDM found with device name sda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] No BDM found with device name sdb, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:38 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] No VIF found with MAC fa:16:3e:26:33:fd, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:38 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Using config drive Apr 21 10:47:39 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Creating config drive at /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk.config Apr 21 10:47:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpahrd7fqs {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:47:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V 
config-2 /tmp/tmpahrd7fqs" returned: 0 in 0.126s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:47:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:40 user nova-compute[70954]: DEBUG nova.network.neutron [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Updated VIF entry in instance network info cache for port 892719ba-88a5-4998-9b27-c47babc15f5c. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:40 user nova-compute[70954]: DEBUG nova.network.neutron [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Updating instance_info_cache with network_info: [{"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ae92414-403f-4130-a20a-d37934f0472b req-6b2fbb80-1a95-47e3-8dba-f1f536bb44ec service nova] Releasing lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG nova.network.neutron [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Successfully updated port: 44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquired lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG nova.network.neutron [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-changed-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Refreshing instance network info cache due to event network-changed-44d4e2d5-0850-4b05-9d97-f3916611f340. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] Acquiring lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:41 user nova-compute[70954]: DEBUG nova.network.neutron [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:47:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:42 user nova-compute[70954]: DEBUG nova.network.neutron [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updating instance_info_cache with network_info: [{"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Releasing lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Instance network_info: |[{"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] Acquired lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.network.neutron [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Refreshing network info cache for port 44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Start _get_guest_xml network_info=[{"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) 
_get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:47:43 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:43 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG 
nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1501895670',display_name='tempest-VolumesAdminNegativeTest-server-1501895670',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1501895670',id=7,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzw1O+hOmmy5NgPW3bdNxqbqSrhNejvipkwcp0JQTVAJHNzFgSc6wLIdKA9lC+AU3ZJ2MAGprLUKfW+mBKTjT3fZH2AvICL2uFFTJNA7ynActmX3XPF5TRREc2oNq2DWg==',key_name='tempest-keypair-1275342497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='94e77e1735854e0c966c42e9a613017f',ramdisk_id='',reservation_id='r-gh0werb9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-243340095',owner_user_name='tempest-VolumesAdminNegativeTest-243340095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c600e01acfe140cabcdfe54958e66108',uuid=8ae797bd-c587-43a3-b941-e6d6d6c74e51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converting VIF {"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.objects.instance [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lazy-loading 'pci_devices' on Instance uuid 8ae797bd-c587-43a3-b941-e6d6d6c74e51 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] End _get_guest_xml xml= Apr 21 10:47:43 user nova-compute[70954]: 8ae797bd-c587-43a3-b941-e6d6d6c74e51 Apr 21 10:47:43 user nova-compute[70954]: instance-00000007 Apr 21 10:47:43 user nova-compute[70954]: 131072 Apr 21 10:47:43 user nova-compute[70954]: 1 Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: tempest-VolumesAdminNegativeTest-server-1501895670 Apr 21 10:47:43 user nova-compute[70954]: 2023-04-21 10:47:43 Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: 128 Apr 21 10:47:43 user nova-compute[70954]: 1 Apr 21 10:47:43 user nova-compute[70954]: 0 Apr 21 10:47:43 user nova-compute[70954]: 0 Apr 21 10:47:43 user nova-compute[70954]: 1 Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: tempest-VolumesAdminNegativeTest-243340095-project-member Apr 21 10:47:43 user nova-compute[70954]: tempest-VolumesAdminNegativeTest-243340095 Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: OpenStack Foundation Apr 21 10:47:43 user nova-compute[70954]: OpenStack Nova Apr 21 10:47:43 user nova-compute[70954]: 0.0.0 Apr 21 10:47:43 user 
nova-compute[70954]: 8ae797bd-c587-43a3-b941-e6d6d6c74e51 Apr 21 10:47:43 user nova-compute[70954]: 8ae797bd-c587-43a3-b941-e6d6d6c74e51 Apr 21 10:47:43 user nova-compute[70954]: Virtual Machine Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: hvm Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Nehalem Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: /dev/urandom Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: Apr 21 10:47:43 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1501895670',display_name='tempest-VolumesAdminNegativeTest-server-1501895670',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1501895670',id=7,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzw1O+hOmmy5NgPW3bdNxqbqSrhNejvipkwcp0JQTVAJHNzFgSc6wLIdKA9lC+AU3ZJ2MAGprLUKfW+mBKTjT3fZH2AvICL2uFFTJNA7ynActmX3XPF5TRREc2oNq2DWg==',key_name='tempest-keypair-1275342497',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='94e77e1735854e0c966c42e9a613017f',ramdisk_id='',reservation_id='r-gh0werb9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-243340095',owner_user_name='tempest-VolumesAdminNegativeTest-243340095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:47:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c600e01acfe140cabcdfe54958e66108',uuid=8ae797bd-c587-43a3-b941-e6d6d6c74e51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converting VIF {"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG os_vif [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44d4e2d5-08, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44d4e2d5-08, col_values=(('external_ids', {'iface-id': '44d4e2d5-0850-4b05-9d97-f3916611f340', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:a2:e1', 'vm-uuid': '8ae797bd-c587-43a3-b941-e6d6d6c74e51'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:43 user nova-compute[70954]: INFO os_vif [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] No VIF found with MAC fa:16:3e:d2:a2:e1, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] 
[instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:43 user nova-compute[70954]: WARNING nova.compute.manager [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state building and task_state spawning. Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:43 user nova-compute[70954]: WARNING nova.compute.manager [req-9930cee4-45e7-4e0b-a715-88d3f29bc3e0 req-3674185d-2165-4027-b6ac-d37e2648ec1d service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state building and task_state spawning. Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.network.neutron [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updated VIF entry in instance network info cache for port 44d4e2d5-0850-4b05-9d97-f3916611f340. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.network.neutron [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updating instance_info_cache with network_info: [{"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0f3fbc4f-fa29-4d2f-a417-e74a6d8a601c req-a4de7bfc-6e6d-4031-b79c-3e16175c6efd service nova] Releasing lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] VM Resumed (Lifecycle Event) Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:44 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Instance spawned successfully. 
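The records above show how nova-compute tags every message that concerns a particular guest with an "[instance: <uuid>]" marker, which is what makes it possible to follow one build among the interleaved Tempest runs. As an illustration only (not part of the captured journal), the Python sketch below groups such lines into per-instance timelines; the input file name is hypothetical and the line format it matches is approximated from the records shown here.

# --- Illustration only; not part of the captured journal. ---
import re
from collections import defaultdict

LINE_RE = re.compile(
    r"^(?P<ts>\w{3} +\d{1,2} \d{2}:\d{2}:\d{2}) \S+ nova-compute\[\d+\]: "
    r"(?P<level>DEBUG|INFO|WARNING|ERROR) (?P<rest>.*)$"
)
INSTANCE_RE = re.compile(r"\[instance: (?P<uuid>[0-9a-f-]{36})\]")

def instance_timelines(path):
    """Group journal lines by the instance UUID they mention."""
    timelines = defaultdict(list)
    with open(path) as fh:
        for line in fh:
            m = LINE_RE.match(line)
            if not m:
                continue
            tagged = INSTANCE_RE.search(m.group("rest"))
            if tagged:
                timelines[tagged.group("uuid")].append(
                    (m.group("ts"), m.group("level"), m.group("rest").rstrip())
                )
    return timelines

if __name__ == "__main__":
    # "nova-compute.log" is a hypothetical plain-text export of this journal.
    for uuid, events in sorted(instance_timelines("nova-compute.log").items()):
        print(uuid, "-", len(events), "records")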
Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] During sync_power_state the instance has a pending task (spawning). Skip. 
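The "Synchronizing instance power state" and "During sync_power_state the instance has a pending task (spawning). Skip." messages above record a deliberate choice: while a task is still in flight, the lifecycle event is not allowed to overwrite the instance's power state. The snippet below is a simplified sketch of that rule written from the log text alone; it is not Nova's actual implementation and the function name is made up for illustration.

# --- Illustration only; a simplified reading of the messages above. ---
def should_sync_power_state(task_state, db_power_state, vm_power_state):
    """Return True when the power state should be reconciled."""
    if task_state is not None:
        # "During sync_power_state the instance has a pending task (...). Skip."
        return False
    # Only reconcile when the database and the hypervisor disagree.
    return db_power_state != vm_power_state

# The case recorded above: task_state 'spawning', DB power_state 0, VM power_state 1.
assert should_sync_power_state("spawning", 0, 1) is False
assert should_sync_power_state(None, 0, 1) is True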
Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] VM Started (Lifecycle Event) Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:44 user nova-compute[70954]: INFO nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Took 16.04 seconds to spawn the instance on the hypervisor. Apr 21 10:47:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:44 user nova-compute[70954]: INFO nova.compute.manager [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Took 17.26 seconds to build instance. 
Apr 21 10:47:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-725e0bf9-99e8-4876-a311-10c7b78ebe4b tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "aecf1ba8-9675-4535-874b-9084361b7693" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.548s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-5f30a128-f2e2-43aa-8b3f-169fe9d04c32 req-6ffec14e-e37b-4ce8-82a5-22d662a05604 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5f30a128-f2e2-43aa-8b3f-169fe9d04c32 req-6ffec14e-e37b-4ce8-82a5-22d662a05604 service nova] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5f30a128-f2e2-43aa-8b3f-169fe9d04c32 req-6ffec14e-e37b-4ce8-82a5-22d662a05604 service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5f30a128-f2e2-43aa-8b3f-169fe9d04c32 req-6ffec14e-e37b-4ce8-82a5-22d662a05604 service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-5f30a128-f2e2-43aa-8b3f-169fe9d04c32 req-6ffec14e-e37b-4ce8-82a5-22d662a05604 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] No waiting events found dispatching network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:45 user nova-compute[70954]: WARNING nova.compute.manager [req-5f30a128-f2e2-43aa-8b3f-169fe9d04c32 req-6ffec14e-e37b-4ce8-82a5-22d662a05604 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received unexpected event network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 for instance with vm_state building and task_state spawning. Apr 21 10:47:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:47 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] VM Resumed (Lifecycle Event) Apr 21 10:47:47 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Instance spawned successfully. 
Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:47 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:47:47 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] VM Started (Lifecycle Event) Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:47:47 user nova-compute[70954]: INFO nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Took 13.24 seconds to spawn the instance on the hypervisor. Apr 21 10:47:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:47:47 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:47:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:47 user nova-compute[70954]: INFO nova.compute.manager [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Took 14.43 seconds to build instance. 
Apr 21 10:47:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-50ced96b-a2aa-490c-8f59-c57f9c72851e tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.589s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-517b9baf-6bac-42f8-bf07-af43a796d93c req-a41ade6e-7804-4df4-a58e-7aa2068e2057 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:47:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-517b9baf-6bac-42f8-bf07-af43a796d93c req-a41ade6e-7804-4df4-a58e-7aa2068e2057 service nova] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:47:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-517b9baf-6bac-42f8-bf07-af43a796d93c req-a41ade6e-7804-4df4-a58e-7aa2068e2057 service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:47:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-517b9baf-6bac-42f8-bf07-af43a796d93c req-a41ade6e-7804-4df4-a58e-7aa2068e2057 service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:47:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-517b9baf-6bac-42f8-bf07-af43a796d93c req-a41ade6e-7804-4df4-a58e-7aa2068e2057 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] No waiting events found dispatching network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:47:48 user nova-compute[70954]: WARNING nova.compute.manager [req-517b9baf-6bac-42f8-bf07-af43a796d93c req-a41ade6e-7804-4df4-a58e-7aa2068e2057 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received unexpected event network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 for instance with vm_state active and task_state None. 
Apr 21 10:47:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:56 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 10:47:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] There are 0 instances to clean {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 10:47:56 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances with incomplete migration {{(pid=70954) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 10:47:56 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:47:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
Apr 21 10:47:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:47:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 84b55fc0-e748-4c05-97ad-a6994c0487d2 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updating instance_info_cache with network_info: [{"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:47:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:00 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 
10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.187s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.173s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.174s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.184s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.004s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:48:03 user nova-compute[70954]: INFO nova.compute.claims [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Claim successful on node user Apr 21 10:48:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:04 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:48:04 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=7868MB free_disk=26.561981201171875GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.295s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.066s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:48:04 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 15bf9321-a92e-4be2-bcae-a943988c811a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance aecf1ba8-9675-4535-874b-9084361b7693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 8ae797bd-c587-43a3-b941-e6d6d6c74e51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance c3100b42-46b3-4371-89f2-e511ca1ce6cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.policy [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d7234a4dcce4289a84f5060f546efb6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '52aece0e34a3451da50638e2930424e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:48:04 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Creating image(s) Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "/opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "/opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "/opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.516s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.152s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk 1073741824" returned: 0 in 0.058s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.215s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Cannot resize image /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.objects.instance [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lazy-loading 'migration_context' on Instance uuid c3100b42-46b3-4371-89f2-e511ca1ce6cd {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Successfully created port: be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Ensure instance console log exists: /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock 
"vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Successfully updated port: be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "refresh_cache-c3100b42-46b3-4371-89f2-e511ca1ce6cd" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquired lock "refresh_cache-c3100b42-46b3-4371-89f2-e511ca1ce6cd" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-changed-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Refreshing instance network info cache due to event network-changed-be8924d4-464a-4572-8b2f-96b2f230297f. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] Acquiring lock "refresh_cache-c3100b42-46b3-4371-89f2-e511ca1ce6cd" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Updating instance_info_cache with network_info: [{"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Releasing lock "refresh_cache-c3100b42-46b3-4371-89f2-e511ca1ce6cd" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Instance network_info: |[{"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] Acquired lock "refresh_cache-c3100b42-46b3-4371-89f2-e511ca1ce6cd" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.neutron [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Refreshing network info cache for port be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Start _get_guest_xml network_info=[{"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:48:06 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:48:06 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:48:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1967603297',display_name='tempest-SnapshotDataIntegrityTests-server-1967603297',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1967603297',id=8,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOI+lxaYCiJ6Cakfh0d0/M3EQtNn7ayCKiNgO3J0ChauJCxkwfuH2I2Rjm736o1FW/bz/bZeZnhFBJEXMyBImhjphifTfaav3xqO3xhVAU45T4aDHcFYSZ0q9YF9LVtTWQ==',key_name='tempest-SnapshotDataIntegrityTests-1505495754',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='52aece0e34a3451da50638e2930424e7',ramdisk_id='',reservation_id='r-w2cfhfmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-816712956',owner_user_name='tempest-SnapshotDataIntegrityTests-816712956-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:48:05Z,user_data=None,user_id='0d7234a4dcce4289a84f5060f546efb6',uuid=c3100b42-46b3-4371-89f2-e511ca1ce6cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Converting VIF {"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.objects.instance [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lazy-loading 'pci_devices' on Instance uuid c3100b42-46b3-4371-89f2-e511ca1ce6cd {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] End _get_guest_xml xml= Apr 21 10:48:06 user nova-compute[70954]: c3100b42-46b3-4371-89f2-e511ca1ce6cd Apr 21 10:48:06 user nova-compute[70954]: instance-00000008 Apr 21 10:48:06 user nova-compute[70954]: 131072 Apr 21 10:48:06 user nova-compute[70954]: 1 Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: tempest-SnapshotDataIntegrityTests-server-1967603297 Apr 21 10:48:06 user nova-compute[70954]: 2023-04-21 10:48:06 Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: 128 Apr 21 10:48:06 user nova-compute[70954]: 1 Apr 21 10:48:06 user nova-compute[70954]: 0 Apr 21 10:48:06 user nova-compute[70954]: 0 Apr 21 10:48:06 user nova-compute[70954]: 1 Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: tempest-SnapshotDataIntegrityTests-816712956-project-member Apr 21 10:48:06 user nova-compute[70954]: tempest-SnapshotDataIntegrityTests-816712956 Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: OpenStack Foundation Apr 21 10:48:06 user nova-compute[70954]: OpenStack Nova Apr 21 10:48:06 user nova-compute[70954]: 0.0.0 Apr 21 10:48:06 user nova-compute[70954]: c3100b42-46b3-4371-89f2-e511ca1ce6cd Apr 21 10:48:06 user 
nova-compute[70954]: c3100b42-46b3-4371-89f2-e511ca1ce6cd Apr 21 10:48:06 user nova-compute[70954]: Virtual Machine Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: hvm Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Nehalem Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: /dev/urandom Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: Apr 21 10:48:06 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:48:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1967603297',display_name='tempest-SnapshotDataIntegrityTests-server-1967603297',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1967603297',id=8,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOI+lxaYCiJ6Cakfh0d0/M3EQtNn7ayCKiNgO3J0ChauJCxkwfuH2I2Rjm736o1FW/bz/bZeZnhFBJEXMyBImhjphifTfaav3xqO3xhVAU45T4aDHcFYSZ0q9YF9LVtTWQ==',key_name='tempest-SnapshotDataIntegrityTests-1505495754',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='52aece0e34a3451da50638e2930424e7',ramdisk_id='',reservation_id='r-w2cfhfmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-816712956',owner_user_name='tempest-SnapshotDataIntegrityTests-816712956-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:48:05Z,user_data=None,user_id='0d7234a4dcce4289a84f5060f546efb6',uuid=c3100b42-46b3-4371-89f2-e511ca1ce6cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Converting VIF {"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG os_vif [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe8924d4-46, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe8924d4-46, col_values=(('external_ids', {'iface-id': 'be8924d4-464a-4572-8b2f-96b2f230297f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:eb:e1', 'vm-uuid': 'c3100b42-46b3-4371-89f2-e511ca1ce6cd'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:06 user nova-compute[70954]: INFO os_vif [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:48:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] No VIF found with MAC fa:16:3e:e5:eb:e1, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:48:07 user nova-compute[70954]: DEBUG nova.network.neutron [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Updated VIF entry in instance network info cache for port be8924d4-464a-4572-8b2f-96b2f230297f. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:48:07 user nova-compute[70954]: DEBUG nova.network.neutron [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Updating instance_info_cache with network_info: [{"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:48:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f07c9b91-9d61-42f9-9430-847c77d5b3ef req-2c6bd6e5-2fdf-4acb-bcfe-0e9f17ca7f7c service nova] Releasing lock "refresh_cache-c3100b42-46b3-4371-89f2-e511ca1ce6cd" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:48:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-93c3ddf9-daff-4e0a-916d-bc6cf30f1fe4 req-5cb1dc16-1f57-4c83-ac93-af894a4ac9cb service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-93c3ddf9-daff-4e0a-916d-bc6cf30f1fe4 req-5cb1dc16-1f57-4c83-ac93-af894a4ac9cb service nova] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-93c3ddf9-daff-4e0a-916d-bc6cf30f1fe4 req-5cb1dc16-1f57-4c83-ac93-af894a4ac9cb service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-93c3ddf9-daff-4e0a-916d-bc6cf30f1fe4 req-5cb1dc16-1f57-4c83-ac93-af894a4ac9cb service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-93c3ddf9-daff-4e0a-916d-bc6cf30f1fe4 req-5cb1dc16-1f57-4c83-ac93-af894a4ac9cb service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] No waiting events found dispatching network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:48:08 user nova-compute[70954]: WARNING nova.compute.manager [req-93c3ddf9-daff-4e0a-916d-bc6cf30f1fe4 req-5cb1dc16-1f57-4c83-ac93-af894a4ac9cb service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received unexpected event network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f for instance with vm_state building and task_state spawning. Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-131f19ae-9241-4ce4-b881-b81ffffe89b9 req-32a91e1d-6cb7-40cf-b689-06e8f79d7f8d service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-131f19ae-9241-4ce4-b881-b81ffffe89b9 req-32a91e1d-6cb7-40cf-b689-06e8f79d7f8d service nova] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-131f19ae-9241-4ce4-b881-b81ffffe89b9 req-32a91e1d-6cb7-40cf-b689-06e8f79d7f8d service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-131f19ae-9241-4ce4-b881-b81ffffe89b9 req-32a91e1d-6cb7-40cf-b689-06e8f79d7f8d service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-131f19ae-9241-4ce4-b881-b81ffffe89b9 req-32a91e1d-6cb7-40cf-b689-06e8f79d7f8d service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] No waiting events found dispatching network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:48:10 user nova-compute[70954]: WARNING nova.compute.manager [req-131f19ae-9241-4ce4-b881-b81ffffe89b9 req-32a91e1d-6cb7-40cf-b689-06e8f79d7f8d service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received unexpected event network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f for instance with vm_state building and task_state spawning. Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:48:10 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] VM Resumed (Lifecycle Event) Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:48:10 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Instance spawned successfully. 
Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Synchronizing instance power state after lifecycle event 
"Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:48:10 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:48:10 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] VM Started (Lifecycle Event) Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:48:10 user nova-compute[70954]: INFO nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Took 6.02 seconds to spawn the instance on the hypervisor. Apr 21 10:48:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:48:10 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:48:10 user nova-compute[70954]: INFO nova.compute.manager [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Took 7.74 seconds to build instance. 
Apr 21 10:48:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1eec999c-a5bd-438a-ba6d-170d6fd47370 tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.877s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:48:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:35 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:36 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:48:53 user nova-compute[70954]: INFO nova.compute.manager [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] instance snapshotting Apr 21 10:48:53 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Beginning live snapshot process Apr 21 10:48:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json -f qcow2 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json -f qcow2" returned: 0 in 0.176s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:53 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json -f qcow2 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json -f qcow2" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpg550z90x/97c2510105474dddadbf530e8b35f327.delta 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpg550z90x/97c2510105474dddadbf530e8b35f327.delta 1073741824" returned: 0 in 0.052s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:54 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Quiescing instance not available: QEMU guest agent is not enabled. Apr 21 10:48:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Checking UEFI support for host arch (x86_64) {{(pid=70954) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 21 10:48:54 user nova-compute[70954]: INFO nova.virt.libvirt.host [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] UEFI support detected Apr 21 10:48:55 user nova-compute[70954]: DEBUG nova.virt.libvirt.guest [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=70954) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 10:48:55 user nova-compute[70954]: DEBUG nova.virt.libvirt.guest [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=70954) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 10:48:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Skipping quiescing instance: QEMU guest agent is not enabled. 
Apr 21 10:48:55 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:48:55 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpg550z90x/97c2510105474dddadbf530e8b35f327.delta /opt/stack/data/nova/instances/snapshots/tmpg550z90x/97c2510105474dddadbf530e8b35f327 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:48:56 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpg550z90x/97c2510105474dddadbf530e8b35f327.delta /opt/stack/data/nova/instances/snapshots/tmpg550z90x/97c2510105474dddadbf530e8b35f327" returned: 0 in 0.912s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:48:56 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Snapshot extracted, beginning image upload Apr 21 10:48:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:48:58 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Snapshot image upload complete Apr 21 10:48:58 user nova-compute[70954]: INFO nova.compute.manager [None req-9766eae8-9767-4d6e-a4c3-df3d5361d124 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Took 5.06 seconds to snapshot the instance on the hypervisor. 
Apr 21 10:48:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:48:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:48:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:48:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:48:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updating instance_info_cache with network_info: [{"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:00 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 
21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:03 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:49:03 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:49:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8169MB free_disk=26.490798950195312GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:49:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 15bf9321-a92e-4be2-bcae-a943988c811a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance aecf1ba8-9675-4535-874b-9084361b7693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 8ae797bd-c587-43a3-b941-e6d6d6c74e51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance c3100b42-46b3-4371-89f2-e511ca1ce6cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing inventories for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating ProviderTree inventory for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing aggregate associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, aggregates: None {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing trait associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:49:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:05 user nova-compute[70954]: 
DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:05 user nova-compute[70954]: INFO nova.compute.manager [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Terminating instance Apr 21 10:49:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-8358e0ea-2ace-4549-88a3-3fe4a5c37377 req-a35f1cb8-443e-4eec-aff9-58a00b3eca70 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-vif-unplugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8358e0ea-2ace-4549-88a3-3fe4a5c37377 req-a35f1cb8-443e-4eec-aff9-58a00b3eca70 service nova] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8358e0ea-2ace-4549-88a3-3fe4a5c37377 req-a35f1cb8-443e-4eec-aff9-58a00b3eca70 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8358e0ea-2ace-4549-88a3-3fe4a5c37377 req-a35f1cb8-443e-4eec-aff9-58a00b3eca70 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-8358e0ea-2ace-4549-88a3-3fe4a5c37377 req-a35f1cb8-443e-4eec-aff9-58a00b3eca70 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] No waiting events found dispatching network-vif-unplugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-8358e0ea-2ace-4549-88a3-3fe4a5c37377 req-a35f1cb8-443e-4eec-aff9-58a00b3eca70 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-vif-unplugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:49:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Instance destroyed successfully. 
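[Editor's note] The inventory payload the resource tracker pushed to placement a few entries above (VCPU total=12, reserved=0, allocation_ratio=4.0; MEMORY_MB total=16023, reserved=512; DISK_GB total=40) is what the scheduler sees as capacity. A minimal sketch of the arithmetic, assuming placement's usual convention of capacity = (total - reserved) * allocation_ratio; the numbers are copied from the log, the formula is an assumption about placement's behaviour, not Nova code:

    # Rough sketch (not Nova code): schedulable capacity from the logged inventory,
    # assuming capacity = (total - reserved) * allocation_ratio.
    inventory = {
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    def capacity(inv):
        """Effective capacity placement can hand out for one resource class."""
        return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

    for rc, inv in inventory.items():
        print(rc, capacity(inv))
    # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0 -- consistent with
    # "Total usable vcpus: 12, total allocated vcpus: 8" still fitting comfortably.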
Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.objects.instance [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lazy-loading 'resources' on Instance uuid dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-2137471025',display_name='tempest-DeleteServersTestJSON-server-2137471025',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-2137471025',id=2,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:47:27Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f12ec80f50254e5bbc5afd5470546c71',ramdisk_id='',reservation_id='r-pdoopfn2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-1827381813',owner_user_name='tempest-DeleteServersTestJSON-1827381813-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:47:27Z,user_data=None,user_id='eb7625e4107240d5a92379ace66052fa',uuid=dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, 
"devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Converting VIF {"id": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "address": "fa:16:3e:77:18:56", "network": {"id": "4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-295901106-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f12ec80f50254e5bbc5afd5470546c71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap781dee4b-a8", "ovs_interfaceid": "781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG os_vif [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap781dee4b-a8, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:06 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:49:06 user nova-compute[70954]: INFO os_vif [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:18:56,bridge_name='br-int',has_traffic_filtering=True,id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3,network=Network(4ef6bc58-0f6a-4b52-a251-5871f0f7d2d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap781dee4b-a8') Apr 21 10:49:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Deleting instance files /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8_del Apr 21 10:49:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Deletion of /opt/stack/data/nova/instances/dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8_del complete Apr 21 10:49:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:06 user nova-compute[70954]: INFO nova.compute.manager [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Took 0.91 seconds to destroy the instance on the hypervisor. Apr 21 10:49:06 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:49:06 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:07 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Took 0.65 seconds to deallocate network for instance. 
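[Editor's note] The terminate sequence for instance dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 above runs through a fixed ordering: destroy the guest on the hypervisor (0.91 s), unplug the OVS VIF via os-vif/ovsdbapp, delete the instance files, then deallocate the Neutron ports (0.65 s); the entries just below continue with the resource-usage update and the placement allocation cleanup. A condensed, hypothetical outline of that ordering; the function names are illustrative only, not Nova's real signatures:

    # Hypothetical outline of the teardown ordering visible in the surrounding log;
    # names are illustrative, not Nova's actual API.
    def terminate_instance(instance):
        with lock(instance.uuid):                    # "Acquiring lock <uuid> by ...do_terminate_instance"
            clear_pending_events(instance)           # "<uuid>-events" lock acquire/release
            destroy_guest(instance)                  # libvirt: "Instance destroyed successfully."
            unplug_vifs(instance)                    # os-vif: DelPortCommand(port=tap..., bridge=br-int)
            delete_instance_files(instance)          # ".../instances/<uuid>_del" removed
            deallocate_network(instance)             # neutron: deallocate_for_instance(), cache -> []
            update_resource_usage(instance)          # resource tracker under "compute_resources" lock
            delete_placement_allocations(instance)   # "Deleted allocations for instance <uuid>" (below)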
Apr 21 10:49:07 user nova-compute[70954]: DEBUG nova.compute.manager [req-e5466b4e-2656-49aa-813d-fe92be761b5e req-e5134d6b-1973-474a-b226-7dfe8fe509c8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-vif-deleted-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:07 user nova-compute[70954]: INFO nova.compute.manager [req-e5466b4e-2656-49aa-813d-fe92be761b5e req-e5134d6b-1973-474a-b226-7dfe8fe509c8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Neutron deleted interface 781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3; detaching it from the instance and deleting it from the info cache Apr 21 10:49:07 user nova-compute[70954]: DEBUG nova.network.neutron [req-e5466b4e-2656-49aa-813d-fe92be761b5e req-e5134d6b-1973-474a-b226-7dfe8fe509c8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG nova.compute.manager [req-e5466b4e-2656-49aa-813d-fe92be761b5e req-e5134d6b-1973-474a-b226-7dfe8fe509c8 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Detach interface failed, port_id=781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3, reason: Instance dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 could not be found. {{(pid=70954) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 
'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:49:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.265s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:07 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Deleted allocations for instance dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8 Apr 21 10:49:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-066dc04f-b23d-46a0-a04c-dc1664ca24de tempest-DeleteServersTestJSON-1827381813 tempest-DeleteServersTestJSON-1827381813-project-member] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.059s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-changed-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Refreshing instance network info cache due to event network-changed-f210779b-302b-4a17-8b57-07837ea54e12. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] Acquiring lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] Acquired lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.network.neutron [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Refreshing network info cache for port f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-b4ad13d5-bb0a-4ab6-8493-7f5cddb58081 req-d036aa55-1292-4ea3-92c1-dccdc95c6521 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received event network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b4ad13d5-bb0a-4ab6-8493-7f5cddb58081 req-d036aa55-1292-4ea3-92c1-dccdc95c6521 service nova] Acquiring lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b4ad13d5-bb0a-4ab6-8493-7f5cddb58081 req-d036aa55-1292-4ea3-92c1-dccdc95c6521 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b4ad13d5-bb0a-4ab6-8493-7f5cddb58081 req-d036aa55-1292-4ea3-92c1-dccdc95c6521 service nova] Lock "dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-b4ad13d5-bb0a-4ab6-8493-7f5cddb58081 req-d036aa55-1292-4ea3-92c1-dccdc95c6521 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] No waiting events found dispatching network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:08 user nova-compute[70954]: WARNING nova.compute.manager [req-b4ad13d5-bb0a-4ab6-8493-7f5cddb58081 req-d036aa55-1292-4ea3-92c1-dccdc95c6521 service nova] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Received unexpected event network-vif-plugged-781dee4b-a8ca-4469-aa8c-a2c3c1bd21b3 for instance with vm_state deleted and task_state None. 
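[Editor's note] The "No waiting events found" / "Received unexpected event ... vm_state deleted" pair above is the external-event dispatch path: code performing an operation registers a waiter keyed by event name and port tag, and an incoming Neutron event pops the matching waiter; if none is registered (here the instance was already deleted when the stale network-vif-plugged arrived), the event is only logged. A toy sketch of that pattern, assuming nothing about Nova's internal classes:

    import threading
    from collections import defaultdict

    class InstanceEventWaiters:
        """Toy waiter registry mirroring the pop/no-waiter behaviour in the log."""
        def __init__(self):
            self._events = defaultdict(dict)   # instance_uuid -> {"name-tag": threading.Event}
            self._lock = threading.Lock()      # analogous to the "<uuid>-events" lock

        def prepare(self, instance_uuid, name, tag):
            ev = threading.Event()
            with self._lock:
                self._events[instance_uuid][f"{name}-{tag}"] = ev
            return ev                          # caller waits on this before proceeding

        def pop(self, instance_uuid, name, tag):
            key = f"{name}-{tag}"
            with self._lock:
                ev = self._events.get(instance_uuid, {}).pop(key, None)
            if ev is None:
                print(f"No waiting events found dispatching {key}")  # same outcome as the WARNING above
                return False
            ev.set()                           # wake the waiter registered by prepare()
            return True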
Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.network.neutron [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updated VIF entry in instance network info cache for port f210779b-302b-4a17-8b57-07837ea54e12. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG nova.network.neutron [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.39", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3eaee887-5d95-4001-ae9a-6286f77a7b79 req-c2c82a93-81b1-4e31-a087-a7e01852df26 service nova] Releasing lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-changed-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Refreshing instance network info cache due to event network-changed-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] Acquiring lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] Acquired lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG nova.network.neutron [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Refreshing network info cache for port fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-changed-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Refreshing instance network info cache due to event network-changed-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] Acquiring lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] Acquired lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:11 user nova-compute[70954]: DEBUG nova.network.neutron [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Refreshing network info cache for port b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:12 user nova-compute[70954]: INFO nova.compute.manager [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] instance snapshotting Apr 21 10:49:12 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Beginning live snapshot process Apr 21 10:49:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updated VIF entry in instance network info cache for port fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updating instance_info_cache with network_info: [{"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ad6474b0-7eb4-44ac-8387-32524acf0391 req-2b0be916-1019-45cc-9b1c-f646faf33254 service nova] Releasing lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Updated VIF entry in instance network info cache for port b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Updating instance_info_cache with network_info: [{"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a5a5bc16-5dc5-4876-9f5a-f30d6257994a req-2f86e8f8-749c-4413-936b-7eef9f22cdc1 service nova] Releasing lock "refresh_cache-dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json -f qcow2 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json -f qcow2" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json -f qcow2 {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json -f qcow2" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82.delta 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82.delta 1073741824" returned: 0 in 0.049s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:13 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Quiescing instance not available: QEMU guest agent is not enabled. 
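[Editor's note] The live snapshot above first probes the instance disk and the raw base image with `qemu-img info`, then creates a temporary qcow2 overlay (the `.delta` file) backed by the base image for the COPY block job to fill; quiescing is skipped because the guest agent is not enabled. A small sketch of the same overlay step driven from Python, using only the commands already shown in the log; the paths and the 1 GiB size are taken from the log entries, not requirements:

    import json
    import subprocess

    base = "/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a"
    delta = "/opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82.delta"

    # Probe the backing image, as the driver does before creating the overlay.
    info = json.loads(subprocess.check_output(
        ["qemu-img", "info", base, "--force-share", "--output=json"]))
    virtual_size = info["virtual-size"]        # 1073741824 in the log (1 GiB root disk)

    # Create the qcow2 delta that the COPY block job will populate.
    subprocess.check_call([
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        delta, str(virtual_size),
    ])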
Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:13 user nova-compute[70954]: INFO nova.compute.manager [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Terminating instance Apr 21 10:49:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-4a52db10-5aa4-4e70-a668-fab9ea05941a req-1f017dd4-1942-462a-847c-312d03659657 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-vif-unplugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4a52db10-5aa4-4e70-a668-fab9ea05941a req-1f017dd4-1942-462a-847c-312d03659657 service nova] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4a52db10-5aa4-4e70-a668-fab9ea05941a req-1f017dd4-1942-462a-847c-312d03659657 service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4a52db10-5aa4-4e70-a668-fab9ea05941a req-1f017dd4-1942-462a-847c-312d03659657 service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-4a52db10-5aa4-4e70-a668-fab9ea05941a req-1f017dd4-1942-462a-847c-312d03659657 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] No waiting events found dispatching network-vif-unplugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-4a52db10-5aa4-4e70-a668-fab9ea05941a req-1f017dd4-1942-462a-847c-312d03659657 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-vif-unplugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.guest [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=70954) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 10:49:14 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Instance destroyed successfully. Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.objects.instance [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lazy-loading 'resources' on Instance uuid dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-721132263',display_name='tempest-AttachVolumeTestJSON-server-721132263',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-721132263',id=5,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE61zuAXT232aH/KOTnubmgBMkuEfigCy73bZO4uuf2B23JR41s8cx2vf+RH51d7wxX9P1MtxP7zNqYI2bDeqdfZasdq2OLkldcjqDGH3vLtRM+8mAr7ZBtqN4SKtJs0UQ==',key_name='tempest-keypair-495885922',keypairs=,launch_index=0,launched_at=2023-04-21T10:47:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d85f51547e5244e495343281725fe320',ramdisk_id='',reservation_id='r-0s4941kd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-2130575493',owner_user_name='tempest-AttachVolumeTestJSON-2130575493-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:47:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25fb0d890b594080bb1bb99dd6294ff1',uuid=dd4d15a1-3a71-49e8-9851-9b49fec6a9e3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converting VIF {"id": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "address": "fa:16:3e:c4:4b:e7", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.2", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb08cd847-5a", "ovs_interfaceid": "b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG os_vif [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb08cd847-5a, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:49:14 user nova-compute[70954]: INFO os_vif [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:4b:e7,bridge_name='br-int',has_traffic_filtering=True,id=b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb08cd847-5a') Apr 21 10:49:14 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 
tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Deleting instance files /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3_del Apr 21 10:49:14 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Deletion of /opt/stack/data/nova/instances/dd4d15a1-3a71-49e8-9851-9b49fec6a9e3_del complete Apr 21 10:49:14 user nova-compute[70954]: INFO nova.compute.manager [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Took 1.12 seconds to destroy the instance on the hypervisor. Apr 21 10:49:14 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.guest [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=70954) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 10:49:14 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Skipping quiescing instance: QEMU guest agent is not enabled. 
Apr 21 10:49:14 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:49:14 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82.delta /opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82.delta /opt/stack/data/nova/instances/snapshots/tmpm7sxvuzh/f737cded158f4d93ae9cc3b839c52f82" returned: 0 in 0.611s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:15 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Snapshot extracted, beginning image upload Apr 21 10:49:15 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:15 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Took 0.84 seconds to deallocate network for instance. 
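
[Editor's note] The snapshot extraction logged just above shells out to qemu-img to flatten the copied .delta overlay into a standalone qcow2 before uploading it to the image service. A minimal sketch of that same invocation through oslo.concurrency's processutils (the helper named in the log lines); the paths are placeholders, not the exact tmpm7sxvuzh files:

    from oslo_concurrency import processutils

    def flatten_snapshot(delta_path, out_path):
        # Equivalent of the logged command:
        #   qemu-img convert -t none -O qcow2 -f qcow2 <delta> <out>
        # -t none : bypass the host page cache when writing the destination
        # -f / -O : explicit source and destination formats (both qcow2 here)
        out, err = processutils.execute(
            'qemu-img', 'convert', '-t', 'none',
            '-O', 'qcow2', '-f', 'qcow2',
            delta_path, out_path)
        return out
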
Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.241s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:15 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Deleted allocations for instance dd4d15a1-3a71-49e8-9851-9b49fec6a9e3 Apr 21 10:49:15 user nova-compute[70954]: DEBUG nova.compute.manager [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] Acquiring lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG nova.compute.manager [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] No waiting events found dispatching network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:15 user nova-compute[70954]: WARNING nova.compute.manager [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received unexpected event network-vif-plugged-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c for instance with vm_state deleted and task_state None. Apr 21 10:49:15 user nova-compute[70954]: DEBUG nova.compute.manager [req-65631f93-6c5f-4f17-b07d-2e8081d1e90f req-0e68a69a-a26c-493e-b355-cda22a7897c4 service nova] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Received event network-vif-deleted-b08cd847-5a3c-4ebf-ac8d-0a8dfd13f57c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6cc952c3-df66-444a-8db5-4fa91baa8d1c tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "dd4d15a1-3a71-49e8-9851-9b49fec6a9e3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.443s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:17 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Snapshot image upload complete Apr 21 10:49:17 user nova-compute[70954]: INFO nova.compute.manager [None req-e4461d2c-0b1c-4a23-a7ab-75529a26ff4b tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Took 4.83 seconds to snapshot the instance on the hypervisor. 
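
[Editor's note] Every "Acquiring lock" / "acquired ... waited Ns" / "released ... held Ns" triple in these records is emitted by oslo.concurrency's lockutils, which Nova uses to serialize work such as do_terminate_instance on a per-instance key (the 2.443s "held" figure above is the whole critical section for instance dd4d15a1). A minimal sketch of the same pattern, with hypothetical helper names and assuming the default in-process (non-external) lock:

    from oslo_concurrency import lockutils

    def do_terminate_instance(instance_uuid):
        # Serialize all operations on one instance behind a named lock;
        # lockutils is what logs the "acquired ... waited" and
        # "released ... held" lines seen in this log.
        with lockutils.lock(instance_uuid):
            shutdown_instance(instance_uuid)   # hypothetical teardown helper

    def shutdown_instance(instance_uuid):
        pass  # placeholder for the hypervisor/network teardown steps
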
Apr 21 10:49:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:18 user nova-compute[70954]: DEBUG nova.compute.manager [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-changed-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:18 user nova-compute[70954]: DEBUG nova.compute.manager [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Refreshing instance network info cache due to event network-changed-892719ba-88a5-4998-9b27-c47babc15f5c. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:49:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] Acquiring lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] Acquired lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:18 user nova-compute[70954]: DEBUG nova.network.neutron [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Refreshing network info cache for port 892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:49:19 user nova-compute[70954]: DEBUG nova.network.neutron [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Updated VIF entry in instance network info cache for port 892719ba-88a5-4998-9b27-c47babc15f5c. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:49:19 user nova-compute[70954]: DEBUG nova.network.neutron [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Updating instance_info_cache with network_info: [{"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ca600245-9ce5-48fd-ab27-b617abe09bda req-774dc6a0-e9ee-46f0-8949-f713ba01f283 service nova] Releasing lock "refresh_cache-aecf1ba8-9675-4535-874b-9084361b7693" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "aecf1ba8-9675-4535-874b-9084361b7693" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:20 user nova-compute[70954]: INFO nova.compute.manager [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Terminating instance Apr 21 10:49:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG nova.compute.manager [req-b5899609-ace2-4e53-a931-09878a03fc57 req-c3df1ddc-814b-4ced-8068-05d87f720a37 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-unplugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b5899609-ace2-4e53-a931-09878a03fc57 req-c3df1ddc-814b-4ced-8068-05d87f720a37 service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b5899609-ace2-4e53-a931-09878a03fc57 req-c3df1ddc-814b-4ced-8068-05d87f720a37 service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b5899609-ace2-4e53-a931-09878a03fc57 req-c3df1ddc-814b-4ced-8068-05d87f720a37 service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG nova.compute.manager [req-b5899609-ace2-4e53-a931-09878a03fc57 req-c3df1ddc-814b-4ced-8068-05d87f720a37 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-unplugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG nova.compute.manager [req-b5899609-ace2-4e53-a931-09878a03fc57 req-c3df1ddc-814b-4ced-8068-05d87f720a37 service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-unplugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Instance destroyed successfully. 
Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.objects.instance [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lazy-loading 'resources' on Instance uuid aecf1ba8-9675-4535-874b-9084361b7693 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-21T10:47:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1366625366',display_name='tempest-AttachSCSIVolumeTestJSON-server-1366625366',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1366625366',id=6,image_ref='140913c9-4f31-4f27-b107-9d11bf6d2801',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBiXQMKehmXbZ+t/TYVT0gSKs0KmTZ5FlYz6zFGTYVIOj5Jx7gtGmebbyFsJUNZbybkDh6qpI1q+o00ju2IhzrS6d4GO5cz3RO8d1HNO4lgr/58RDDJYurBNqFihZhWt2A==',key_name='tempest-keypair-538215549',keypairs=,launch_index=0,launched_at=2023-04-21T10:47:44Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a3d2c4f2fb9f45559c4e51e86339a0e0',ramdisk_id='',reservation_id='r-1n6qmth7',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='140913c9-4f31-4f27-b107-9d11bf6d2801',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1586367620',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1586367620-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:47:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c0b1fae0fc8d4b9a998ab0679bace1a1',uuid=aecf1ba8-9675-4535-874b-9084361b7693,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.181", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Converting VIF {"id": "892719ba-88a5-4998-9b27-c47babc15f5c", "address": "fa:16:3e:26:33:fd", "network": {"id": "db5893fb-88b3-400d-8c55-c8f24d5b9084", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-271940369-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a3d2c4f2fb9f45559c4e51e86339a0e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap892719ba-88", "ovs_interfaceid": "892719ba-88a5-4998-9b27-c47babc15f5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG os_vif [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DelPortCommand(_result=None, port=tap892719ba-88, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: INFO os_vif [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:33:fd,bridge_name='br-int',has_traffic_filtering=True,id=892719ba-88a5-4998-9b27-c47babc15f5c,network=Network(db5893fb-88b3-400d-8c55-c8f24d5b9084),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap892719ba-88') Apr 21 10:49:21 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Deleting instance files /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693_del Apr 21 10:49:21 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Deletion of /opt/stack/data/nova/instances/aecf1ba8-9675-4535-874b-9084361b7693_del complete Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: INFO nova.compute.manager [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Took 0.70 seconds to destroy the instance on the hypervisor. Apr 21 10:49:21 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:49:21 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] VM Stopped (Lifecycle Event) Apr 21 10:49:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-6235624a-3701-4f17-815c-c6dfbc1897dd None None] [instance: dd34ae7e-dcf2-4bb2-8ea3-0a3ee553efd8] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:22 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Took 1.35 seconds to deallocate network for instance. 
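
[Editor's note] Unplugging the VIF drives an OVSDB transaction, the "DelPortCommand(... port=tap892719ba-88, bridge=br-int, if_exists=True)" record above, which removes the instance's tap port from br-int. A rough equivalent expressed as the ovs-vsctl CLI run from Python rather than through ovsdbapp's IDL transaction, purely for illustration:

    import subprocess

    def del_tap_port(bridge, port):
        # Same effect as the logged DelPortCommand: remove the port if it
        # exists, and succeed silently if it is already gone.
        subprocess.run(
            ['ovs-vsctl', '--if-exists', 'del-port', bridge, port],
            check=True)

    # e.g. del_tap_port('br-int', 'tap892719ba-88')
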
Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:22 user nova-compute[70954]: WARNING nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state deleted and task_state None. 
Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:22 user nova-compute[70954]: WARNING nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state deleted and task_state None. 
Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:22 user nova-compute[70954]: WARNING nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state deleted and task_state None. 
Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-unplugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-unplugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:22 user nova-compute[70954]: WARNING nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-unplugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state deleted and task_state None. 
Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Acquiring lock "aecf1ba8-9675-4535-874b-9084361b7693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] Lock "aecf1ba8-9675-4535-874b-9084361b7693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] No waiting events found dispatching network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:22 user nova-compute[70954]: WARNING nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received unexpected event network-vif-plugged-892719ba-88a5-4998-9b27-c47babc15f5c for instance with vm_state deleted and task_state None. 
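
[Editor's note] Both terminate paths in this section end with the resource tracker comparing the compute provider's inventory against placement and logging "Inventory has not changed"; the payload it compares is the per-resource-class mapping shown in those records. Reproduced here as a Python structure with the fields annotated (values copied from the log for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, not normative defaults):

    inventory = {
        'VCPU': {
            'total': 12,              # host-reported CPUs
            'reserved': 0,
            'min_unit': 1, 'max_unit': 12, 'step_size': 1,
            'allocation_ratio': 4.0,  # overcommit: 12 * 4.0 schedulable vCPUs
        },
        'MEMORY_MB': {
            'total': 16023, 'reserved': 512,
            'min_unit': 1, 'max_unit': 16023, 'step_size': 1,
            'allocation_ratio': 1.0,
        },
        'DISK_GB': {
            'total': 40, 'reserved': 0,
            'min_unit': 1, 'max_unit': 40, 'step_size': 1,
            'allocation_ratio': 1.0,
        },
    }
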
Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-0855b1d6-74e3-4042-9ed0-718bd0f8537c req-c4911d29-111c-4c5d-add7-02176e4df01f service nova] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Received event network-vif-deleted-892719ba-88a5-4998-9b27-c47babc15f5c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:49:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.477s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:23 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Deleted allocations for instance aecf1ba8-9675-4535-874b-9084361b7693 Apr 21 10:49:23 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-c4065aa8-1fcb-406a-8b42-fc30bf71de4c tempest-AttachSCSIVolumeTestJSON-1586367620 tempest-AttachSCSIVolumeTestJSON-1586367620-project-member] Lock "aecf1ba8-9675-4535-874b-9084361b7693" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.745s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:24 user nova-compute[70954]: DEBUG nova.compute.manager [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-changed-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:24 user nova-compute[70954]: DEBUG nova.compute.manager [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Refreshing instance network info cache due to event network-changed-44d4e2d5-0850-4b05-9d97-f3916611f340. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:49:24 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] Acquiring lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:24 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] Acquired lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:24 user nova-compute[70954]: DEBUG nova.network.neutron [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Refreshing network info cache for port 44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:49:25 user nova-compute[70954]: DEBUG nova.network.neutron [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updated VIF entry in instance network info cache for port 44d4e2d5-0850-4b05-9d97-f3916611f340. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:49:25 user nova-compute[70954]: DEBUG nova.network.neutron [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updating instance_info_cache with network_info: [{"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:25 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4c934d8b-d3df-42ad-bf87-bbd673083357 req-cdb2a2e9-7908-47c3-97ea-57b26a6776ee service nova] Releasing lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:27 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:49:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:27 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:49:27 user nova-compute[70954]: INFO nova.compute.claims [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Claim successful on node user Apr 21 10:49:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:49:28 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.policy [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c600e01acfe140cabcdfe54958e66108', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '94e77e1735854e0c966c42e9a613017f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:49:28 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Creating image(s) Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "/opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "/opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "/opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.007s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk 1073741824" returned: 0 in 0.050s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s 
{{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.155s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Checking if we can resize image /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Successfully created port: 76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Cannot resize image /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lazy-loading 'migration_context' on Instance uuid 3dd95a26-8652-40f5-b357-3cbc8a38628a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Ensure instance console log exists: /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Successfully updated port: 76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:49:29 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] VM Stopped (Lifecycle Event) Apr 21 10:49:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock 
"refresh_cache-3dd95a26-8652-40f5-b357-3cbc8a38628a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquired lock "refresh_cache-3dd95a26-8652-40f5-b357-3cbc8a38628a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-9d0b6d8b-efe3-49c8-bf01-a90f36e6c9a5 None None] [instance: dd4d15a1-3a71-49e8-9851-9b49fec6a9e3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-changed-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Refreshing instance network info cache due to event network-changed-76364ccf-028e-4291-b953-431265bcfabb. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] Acquiring lock "refresh_cache-3dd95a26-8652-40f5-b357-3cbc8a38628a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Updating instance_info_cache with network_info: [{"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Releasing lock "refresh_cache-3dd95a26-8652-40f5-b357-3cbc8a38628a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Instance network_info: |[{"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] Acquired lock "refresh_cache-3dd95a26-8652-40f5-b357-3cbc8a38628a" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.neutron [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Refreshing network info cache for port 76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Start _get_guest_xml network_info=[{"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:49:29 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:49:29 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:49:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1890729888',display_name='tempest-VolumesAdminNegativeTest-server-1890729888',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1890729888',id=9,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='94e77e1735854e0c966c42e9a613017f',ramdisk_id='',reservation_id='r-bg3jhvup',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-243340095',owner_user_name='tempest-VolumesAdminNega
tiveTest-243340095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:49:28Z,user_data=None,user_id='c600e01acfe140cabcdfe54958e66108',uuid=3dd95a26-8652-40f5-b357-3cbc8a38628a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converting VIF {"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:49:29 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] 
Lazy-loading 'pci_devices' on Instance uuid 3dd95a26-8652-40f5-b357-3cbc8a38628a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}}
Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] End _get_guest_xml xml= [libvirt guest domain XML omitted: the XML markup was lost in extraction; surviving values include uuid 3dd95a26-8652-40f5-b357-3cbc8a38628a, name instance-00000009, 131072 KiB memory, 1 vCPU, nova metadata name tempest-VolumesAdminNegativeTest-server-1890729888 created 2023-04-21 10:49:29, owner tempest-VolumesAdminNegativeTest-243340095 / tempest-VolumesAdminNegativeTest-243340095-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:49:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1890729888',display_name='tempest-VolumesAdminNegativeTest-server-1890729888',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1890729888',id=9,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='94e77e1735854e0c966c42e9a613017f',ramdisk_id='',reservation_id='r-bg3jhvup',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-243340095',owner_user_name='tempest-VolumesAdminNegativeTest-243340095-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:49:28Z,user_data=None,user_id='c600e01acfe140cabcdfe54958e66108',uuid=3dd95a26-8652-40f5-b357-3cbc8a38628a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [],
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converting VIF {"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG os_vif [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:30 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76364ccf-02, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap76364ccf-02, col_values=(('external_ids', {'iface-id': '76364ccf-028e-4291-b953-431265bcfabb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:ff:81', 'vm-uuid': '3dd95a26-8652-40f5-b357-3cbc8a38628a'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:30 user nova-compute[70954]: INFO os_vif [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] No VIF found with MAC fa:16:3e:10:ff:81, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.network.neutron [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Updated VIF entry in instance network info cache for port 76364ccf-028e-4291-b953-431265bcfabb. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG nova.network.neutron [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Updating instance_info_cache with network_info: [{"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5c4c4293-e4c9-4c42-930e-ed7b503dca74 req-4613b2d3-3c8e-4f8c-b67d-6481c527c3b2 service nova] Releasing lock "refresh_cache-3dd95a26-8652-40f5-b357-3cbc8a38628a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-2af8a62d-c358-4fbb-a46c-f86d31faed7c req-a81e189e-d3d0-4fe4-ae0e-976b51eac39f service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2af8a62d-c358-4fbb-a46c-f86d31faed7c req-a81e189e-d3d0-4fe4-ae0e-976b51eac39f service nova] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2af8a62d-c358-4fbb-a46c-f86d31faed7c req-a81e189e-d3d0-4fe4-ae0e-976b51eac39f service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2af8a62d-c358-4fbb-a46c-f86d31faed7c req-a81e189e-d3d0-4fe4-ae0e-976b51eac39f service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-2af8a62d-c358-4fbb-a46c-f86d31faed7c req-a81e189e-d3d0-4fe4-ae0e-976b51eac39f service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] No waiting events found dispatching network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:31 user nova-compute[70954]: WARNING nova.compute.manager [req-2af8a62d-c358-4fbb-a46c-f86d31faed7c req-a81e189e-d3d0-4fe4-ae0e-976b51eac39f service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received unexpected event network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb for instance with vm_state building and task_state spawning. Apr 21 10:49:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:49:33 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] VM Resumed (Lifecycle Event) Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:49:33 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Instance spawned successfully. 
Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 
3dd95a26-8652-40f5-b357-3cbc8a38628a] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:49:33 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:49:33 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] VM Started (Lifecycle Event) Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:49:33 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:49:33 user nova-compute[70954]: INFO nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Took 5.31 seconds to spawn the instance on the hypervisor. Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:33 user nova-compute[70954]: INFO nova.compute.manager [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Took 5.96 seconds to build instance. 
Apr 21 10:49:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d1953efa-c961-4dce-9abd-afa4e66c6439 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.063s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-ee1868bb-f2a8-4a00-82c6-339872839ab2 req-82a7f9e9-d4ae-45c3-8bc9-5f7fe6d94935 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ee1868bb-f2a8-4a00-82c6-339872839ab2 req-82a7f9e9-d4ae-45c3-8bc9-5f7fe6d94935 service nova] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ee1868bb-f2a8-4a00-82c6-339872839ab2 req-82a7f9e9-d4ae-45c3-8bc9-5f7fe6d94935 service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ee1868bb-f2a8-4a00-82c6-339872839ab2 req-82a7f9e9-d4ae-45c3-8bc9-5f7fe6d94935 service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-ee1868bb-f2a8-4a00-82c6-339872839ab2 req-82a7f9e9-d4ae-45c3-8bc9-5f7fe6d94935 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] No waiting events found dispatching network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:33 user nova-compute[70954]: WARNING nova.compute.manager [req-ee1868bb-f2a8-4a00-82c6-339872839ab2 req-82a7f9e9-d4ae-45c3-8bc9-5f7fe6d94935 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received unexpected event network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb for instance with vm_state active and task_state None. 
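The 'Acquiring lock ... by', 'acquired by ... :: waited' and '"released" by ... :: held' lines throughout this build come from oslo.concurrency's lockutils helpers. A minimal sketch of the pattern that produces them, assuming only that oslo.concurrency is installed; the lock names below are taken from the log, but the functions are illustrative rather than Nova's real implementation.

from oslo_concurrency import lockutils

# Decorator form: lockutils logs the "acquired by ... :: waited" and
# '"released" by ... :: held' DEBUG lines around the wrapped function.
@lockutils.synchronized('3dd95a26-8652-40f5-b357-3cbc8a38628a')
def _locked_do_build_and_run_instance():
    pass  # critical section guarded by the per-instance lock

# Context-manager form, as used for locks such as "compute_resources":
# it logs "Acquiring lock", "Acquired lock" and "Releasing lock".
with lockutils.lock('compute_resources'):
    pass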
Apr 21 10:49:35 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:36 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:49:36 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: aecf1ba8-9675-4535-874b-9084361b7693] VM Stopped (Lifecycle Event) Apr 21 10:49:36 user nova-compute[70954]: DEBUG nova.compute.manager [None req-248f7032-33ca-40eb-a999-aa5fb52473f2 None None] [instance: aecf1ba8-9675-4535-874b-9084361b7693] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:49:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:49:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 10:49:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:49:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:49:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:50 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 
tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:54 user nova-compute[70954]: INFO nova.compute.manager [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Terminating instance Apr 21 10:49:54 user nova-compute[70954]: DEBUG nova.compute.manager [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:49:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-ccd1db8b-7566-487e-b26c-904cf50a7391 req-947ea6ce-1e1d-40e4-a7ca-d82cf38b1da2 service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-vif-unplugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ccd1db8b-7566-487e-b26c-904cf50a7391 req-947ea6ce-1e1d-40e4-a7ca-d82cf38b1da2 service nova] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ccd1db8b-7566-487e-b26c-904cf50a7391 req-947ea6ce-1e1d-40e4-a7ca-d82cf38b1da2 service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ccd1db8b-7566-487e-b26c-904cf50a7391 req-947ea6ce-1e1d-40e4-a7ca-d82cf38b1da2 service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-ccd1db8b-7566-487e-b26c-904cf50a7391 req-947ea6ce-1e1d-40e4-a7ca-d82cf38b1da2 service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] No waiting events found dispatching network-vif-unplugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-ccd1db8b-7566-487e-b26c-904cf50a7391 req-947ea6ce-1e1d-40e4-a7ca-d82cf38b1da2 service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-vif-unplugged-be8924d4-464a-4572-8b2f-96b2f230297f for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:49:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Instance destroyed successfully. Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.objects.instance [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lazy-loading 'resources' on Instance uuid c3100b42-46b3-4371-89f2-e511ca1ce6cd {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:48:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1967603297',display_name='tempest-SnapshotDataIntegrityTests-server-1967603297',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1967603297',id=8,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOI+lxaYCiJ6Cakfh0d0/M3EQtNn7ayCKiNgO3J0ChauJCxkwfuH2I2Rjm736o1FW/bz/bZeZnhFBJEXMyBImhjphifTfaav3xqO3xhVAU45T4aDHcFYSZ0q9YF9LVtTWQ==',key_name='tempest-SnapshotDataIntegrityTests-1505495754',keypairs=,launch_index=0,launched_at=2023-04-21T10:48:10Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='52aece0e34a3451da50638e2930424e7',ramdisk_id='',reservation_id='r-w2cfhfmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-816712956',owner_user_name='tempest-SnapshotDataIntegrityTests-816712956-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:48:11Z,user_data=None,user_id='0d7234a4dcce4289a84f5060f546efb6',uuid=c3100b42-46b3-4371-89f2-e511ca1ce6cd,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", 
"subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Converting VIF {"id": "be8924d4-464a-4572-8b2f-96b2f230297f", "address": "fa:16:3e:e5:eb:e1", "network": {"id": "624bf70c-30f1-41f5-b380-69af8cfb5fd6", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-675355002-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "52aece0e34a3451da50638e2930424e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe8924d4-46", "ovs_interfaceid": "be8924d4-464a-4572-8b2f-96b2f230297f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG os_vif [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe8924d4-46, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:49:55 user nova-compute[70954]: INFO os_vif [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:eb:e1,bridge_name='br-int',has_traffic_filtering=True,id=be8924d4-464a-4572-8b2f-96b2f230297f,network=Network(624bf70c-30f1-41f5-b380-69af8cfb5fd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe8924d4-46') Apr 21 10:49:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Deleting instance files /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd_del Apr 21 10:49:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Deletion of /opt/stack/data/nova/instances/c3100b42-46b3-4371-89f2-e511ca1ce6cd_del complete Apr 21 10:49:55 user nova-compute[70954]: INFO nova.compute.manager [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 10:49:55 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:49:55 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:49:56 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:49:56 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Took 0.46 seconds to deallocate network for instance. Apr 21 10:49:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:56 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:49:56 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:49:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.267s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:56 user nova-compute[70954]: INFO 
nova.scheduler.client.report [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Deleted allocations for instance c3100b42-46b3-4371-89f2-e511ca1ce6cd Apr 21 10:49:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-88f383d3-7f6c-41fc-a57c-36204d733a3e tempest-SnapshotDataIntegrityTests-816712956 tempest-SnapshotDataIntegrityTests-816712956-project-member] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.753s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] Acquiring lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] Lock "c3100b42-46b3-4371-89f2-e511ca1ce6cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] No waiting events found dispatching network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:49:57 user nova-compute[70954]: WARNING nova.compute.manager [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received unexpected event network-vif-plugged-be8924d4-464a-4572-8b2f-96b2f230297f for instance with vm_state deleted and task_state None. 
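The "Inventory has not changed for provider ... based on inventory data" entries show the resource-provider inventory the resource tracker reports to Placement. Schedulable capacity per resource class follows from (total - reserved) * allocation_ratio; the short sketch below works that arithmetic through with the exact figures from the logged inventory dict (the variable names are illustrative).

# Effective capacity implied by the logged inventory: Placement applies
# (total - reserved) * allocation_ratio per resource class.
inventory = {
    'VCPU': {'total': 12, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 40, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
# -> VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0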
Apr 21 10:49:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-bc93ff9f-0af6-4f4d-a182-a762cf76f882 req-e0a28607-0eab-45b0-8b68-b888cdbc6fab service nova] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Received event network-vif-deleted-be8924d4-464a-4572-8b2f-96b2f230297f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:49:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:49:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:49:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:49:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:49:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.39", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 
10:50:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:00 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "db58ccab-48e9-499e-a916-9a68709958d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "db58ccab-48e9-499e-a916-9a68709958d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:50:01 user nova-compute[70954]: INFO nova.compute.claims [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Claim successful on node user Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:50:01 user 
nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:50:01 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:50:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:01 user nova-compute[70954]: INFO nova.virt.block_device [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Booting with blank volume at /dev/vda Apr 21 10:50:02 user nova-compute[70954]: DEBUG nova.policy [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54c67d90b6014d9ea24ef2552006bc04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aad84a0e014f47ddaeaddc88bf16b0a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:02 user nova-compute[70954]: WARNING nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Volume id: b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 finished being created but its status is error. Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] Traceback (most recent call last): Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] driver_block_device.attach_block_devices( Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] _log_and_attach(device) Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] bdm.attach(*attach_args, **attach_kwargs) Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] self.volume_id, self.attachment_id = self._create_volume( Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 21 10:50:02 user 
nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] with excutils.save_and_reraise_exception(): Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] self.force_reraise() Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] raise self.value Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] wait_func(context, volume_id) Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] nova.exception.VolumeNotCreated: Volume b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
Apr 21 10:50:02 user nova-compute[70954]: ERROR nova.compute.manager [instance: db58ccab-48e9-499e-a916-9a68709958d6] Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Successfully created port: 8df080a4-e60b-4b0e-92c5-4059d9fc6e3b {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:02 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:50:02 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
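[Annotation] The update_available_resource audit running under req-d6d1a7f7 repeatedly shells out to qemu-img under oslo_concurrency.prlimit to size each instance disk; the command appears verbatim in the entries above. Below is a minimal sketch of how one might re-run a single probe by hand (for example when investigating slow audits), assuming qemu-img and the oslo.concurrency package are installed and the disk path still exists on this host; the chosen disk path is just one of those logged above.

```python
# Sketch: re-run one disk probe exactly as logged by the resource tracker audit.
# Assumes qemu-img and oslo.concurrency are installed and the path exists.
import json
import subprocess

DISK = "/opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk"

cmd = [
    "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
    "--as=1073741824",            # cap address space at 1 GiB, as in the log
    "--cpu=30",                   # cap CPU seconds, as in the log
    "--", "env", "LC_ALL=C", "LANG=C",
    "qemu-img", "info", DISK, "--force-share", "--output=json",
]

out = subprocess.run(cmd, capture_output=True, text=True, check=True)
info = json.loads(out.stdout)
print(info["virtual-size"], info.get("actual-size"))
```

Each probe in the log returns in roughly 0.14s; the prlimit wrapper bounds memory and CPU so a misbehaving qemu-img cannot stall the audit.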
Apr 21 10:50:02 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8496MB free_disk=26.510887145996094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 15bf9321-a92e-4be2-bcae-a943988c811a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 8ae797bd-c587-43a3-b941-e6d6d6c74e51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 3dd95a26-8652-40f5-b357-3cbc8a38628a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance db58ccab-48e9-499e-a916-9a68709958d6 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Successfully updated port: 
8df080a4-e60b-4b0e-92c5-4059d9fc6e3b {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "refresh_cache-db58ccab-48e9-499e-a916-9a68709958d6" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquired lock "refresh_cache-db58ccab-48e9-499e-a916-9a68709958d6" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.manager [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Received event network-changed-8df080a4-e60b-4b0e-92c5-4059d9fc6e3b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.manager [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Refreshing instance network info cache due to event network-changed-8df080a4-e60b-4b0e-92c5-4059d9fc6e3b. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] Acquiring lock "refresh_cache-db58ccab-48e9-499e-a916-9a68709958d6" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Updating instance_info_cache with network_info: [{"id": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "address": "fa:16:3e:ca:e0:02", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8df080a4-e6", "ovs_interfaceid": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Releasing lock "refresh_cache-db58ccab-48e9-499e-a916-9a68709958d6" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Instance network_info: |[{"id": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "address": "fa:16:3e:ca:e0:02", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8df080a4-e6", "ovs_interfaceid": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] Acquired lock 
"refresh_cache-db58ccab-48e9-499e-a916-9a68709958d6" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.network.neutron [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Refreshing network info cache for port 8df080a4-e60b-4b0e-92c5-4059d9fc6e3b {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG nova.compute.claims [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Aborting claim: {{(pid=70954) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG 
nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Build of instance db58ccab-48e9-499e-a916-9a68709958d6 aborted: Volume b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.compute.utils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Build of instance db58ccab-48e9-499e-a916-9a68709958d6 aborted: Volume b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=70954) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 21 10:50:04 user nova-compute[70954]: ERROR nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Build of instance db58ccab-48e9-499e-a916-9a68709958d6 aborted: Volume b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance db58ccab-48e9-499e-a916-9a68709958d6 aborted: Volume b5cb6e0d-2d7b-4bd6-ab3a-e7e007ff8b31 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
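[Annotation] The build is aborted because Cinder reported the boot volume in an 'error' state on the very first poll, so the wait loop gave up after "0 seconds or 1 attempts". The sketch below is only an illustrative approximation of that wait behaviour, not nova's actual _await_block_device_map_created; `volume_api` is assumed to expose a cinderclient-style get(volume_id) returning an object with a .status field, and the retry/interval defaults are modelled on nova's block_device_allocate_retries options.

```python
# Illustrative approximation of the volume-creation wait seen in the log above;
# not nova's actual implementation.
import time


class VolumeNotCreated(Exception):
    pass


def wait_for_volume(volume_api, volume_id, retries=60, interval=3):
    start = time.time()
    attempts = max(1, retries)            # always count at least one attempt
    for attempt in range(1, attempts + 1):
        vol = volume_api.get(volume_id)
        if vol.status == "available":
            return vol
        if vol.status == "error":
            break                         # no point retrying, as in the log
        time.sleep(interval)
    raise VolumeNotCreated(
        f"Volume {volume_id} did not finish being created even after we "
        f"waited {int(time.time() - start)} seconds or {attempt} attempts. "
        f"And its status is {vol.status}.")
```

With an immediately errored volume this reproduces the "0 seconds or 1 attempts" wording in the exception above; the underlying failure is on the Cinder side, and nova responds by aborting the claim and cleaning up, as the following entries show.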
Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Unplugging VIFs for instance {{(pid=70954) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1063701166',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1063701166',id=10,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-xaxwg62u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:50:02Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=db58ccab-48e9-499e-a916-9a68709958d6,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "address": "fa:16:3e:ca:e0:02", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap8df080a4-e6", "ovs_interfaceid": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "address": "fa:16:3e:ca:e0:02", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8df080a4-e6", "ovs_interfaceid": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:e0:02,bridge_name='br-int',has_traffic_filtering=True,id=8df080a4-e60b-4b0e-92c5-4059d9fc6e3b,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8df080a4-e6') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG os_vif [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:e0:02,bridge_name='br-int',has_traffic_filtering=True,id=8df080a4-e60b-4b0e-92c5-4059d9fc6e3b,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8df080a4-e6') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8df080a4-e6, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:50:04 user nova-compute[70954]: INFO os_vif [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:e0:02,bridge_name='br-int',has_traffic_filtering=True,id=8df080a4-e60b-4b0e-92c5-4059d9fc6e3b,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8df080a4-e6') Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Unplugged VIFs for instance {{(pid=70954) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.network.neutron [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Updated VIF entry in instance network info cache for port 8df080a4-e60b-4b0e-92c5-4059d9fc6e3b. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.network.neutron [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Updating instance_info_cache with network_info: [{"id": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "address": "fa:16:3e:ca:e0:02", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8df080a4-e6", "ovs_interfaceid": "8df080a4-e60b-4b0e-92c5-4059d9fc6e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-29bea600-7a0f-4fad-a1f7-100925bee254 req-76a70043-db9a-482d-a106-f8d276256492 service nova] Releasing lock "refresh_cache-db58ccab-48e9-499e-a916-9a68709958d6" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:04 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:04 user nova-compute[70954]: INFO nova.compute.manager [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: db58ccab-48e9-499e-a916-9a68709958d6] Took 0.63 seconds to deallocate network for instance. 
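[Annotation] The recurring "Acquiring lock ... / Lock ... acquired ... :: waited / Lock ... released ... :: held" triples throughout this log come from oslo.concurrency's synchronized wrapper (the inner() frames cited at lockutils.py:404/409/423). A minimal sketch of how a service emits the same pattern, assuming only the oslo.concurrency library and DEBUG logging enabled:

```python
# Minimal sketch of the lock pattern behind the "compute_resources" messages;
# assumes oslo.concurrency is installed and DEBUG logging is configured.
import logging

from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)


@lockutils.synchronized("compute_resources")
def update_resources():
    # Runs with the in-process "compute_resources" lock held, so concurrent
    # claim/audit paths cannot interleave their resource updates.
    pass


update_resources()   # emits acquired/waited and released/held lines like above
```

The "waited" and "held" durations in the log (for example held 0.352s by _update_available_resource) come from this wrapper timing the lock acquisition and the decorated call.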
Apr 21 10:50:05 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Deleted allocations for instance db58ccab-48e9-499e-a916-9a68709958d6 Apr 21 10:50:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d5f9eeb7-5367-4991-af65-0e08d21c6508 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "db58ccab-48e9-499e-a916-9a68709958d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 3.798s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "566040e7-8140-467b-b814-8d7eb62ef735" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:50:07 user nova-compute[70954]: INFO nova.compute.claims [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Claim successful on node user Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.network.neutron [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:50:07 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.policy [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25fb0d890b594080bb1bb99dd6294ff1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd85f51547e5244e495343281725fe320', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:50:07 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Creating image(s) Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "/opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "/opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "/opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.152s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk 1073741824" returned: 0 in 0.057s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock 
"7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.196s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Checking if we can resize image /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG nova.network.neutron [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Successfully created port: 69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Cannot resize image 
/opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG nova.objects.instance [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lazy-loading 'migration_context' on Instance uuid 566040e7-8140-467b-b814-8d7eb62ef735 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Ensure instance console log exists: /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Successfully updated port: 69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 
tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquired lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.compute.manager [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-changed-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.compute.manager [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Refreshing instance network info cache due to event network-changed-69a7ef96-ead5-4890-a014-d86e90fa5050. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] Acquiring lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updating instance_info_cache with network_info: [{"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Releasing lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Instance network_info: |[{"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] Acquired lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.neutron [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Refreshing network info cache for port 69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Start _get_guest_xml network_info=[{"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:50:09 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:50:09 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:50:09 
user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-778341132',display_name='tempest-AttachVolumeTestJSON-server-778341132',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-778341132',id=11,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4Me401imS8FOFYXCWEtUHYjBl+nNDFDOEp9F0qU3EcEjNmrofO3LBhufyQq8+T19fUEnsB1kP8hSrvs1kB/Y0kxUe6+elfuW9GrNxUHrtfrboj4/KWC2DfC017u1bvqA==',key_name='tempest-keypair-950390145',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d85f51547e5244e495343281725fe320',ramdisk_id='',reservation_id='r-jjr705pr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2130575493',owner_user_name='tempest-AttachVolumeTestJSON-2130575493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:50:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25fb0d890b594080bb1bb99dd6294ff1',uuid=566040e7-8140-467b-b814-8d7eb62ef735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converting VIF {"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.objects.instance [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lazy-loading 'pci_devices' on Instance uuid 566040e7-8140-467b-b814-8d7eb62ef735 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] End _get_guest_xml xml= Apr 21 10:50:09 user nova-compute[70954]: 566040e7-8140-467b-b814-8d7eb62ef735 Apr 21 10:50:09 user nova-compute[70954]: instance-0000000b Apr 21 10:50:09 user nova-compute[70954]: 131072 Apr 21 10:50:09 user nova-compute[70954]: 1 Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: tempest-AttachVolumeTestJSON-server-778341132 Apr 21 10:50:09 user nova-compute[70954]: 2023-04-21 10:50:09 Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: 128 Apr 21 10:50:09 user nova-compute[70954]: 1 Apr 21 10:50:09 user nova-compute[70954]: 0 Apr 21 10:50:09 user nova-compute[70954]: 0 Apr 21 10:50:09 user nova-compute[70954]: 1 Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: tempest-AttachVolumeTestJSON-2130575493-project-member Apr 21 10:50:09 user nova-compute[70954]: tempest-AttachVolumeTestJSON-2130575493 Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: OpenStack Foundation Apr 21 10:50:09 user nova-compute[70954]: OpenStack Nova Apr 21 10:50:09 user nova-compute[70954]: 0.0.0 Apr 21 10:50:09 user nova-compute[70954]: 
566040e7-8140-467b-b814-8d7eb62ef735 Apr 21 10:50:09 user nova-compute[70954]: 566040e7-8140-467b-b814-8d7eb62ef735 Apr 21 10:50:09 user nova-compute[70954]: Virtual Machine Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: hvm Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Nehalem Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: /dev/urandom Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: Apr 21 10:50:09 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-778341132',display_name='tempest-AttachVolumeTestJSON-server-778341132',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-778341132',id=11,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4Me401imS8FOFYXCWEtUHYjBl+nNDFDOEp9F0qU3EcEjNmrofO3LBhufyQq8+T19fUEnsB1kP8hSrvs1kB/Y0kxUe6+elfuW9GrNxUHrtfrboj4/KWC2DfC017u1bvqA==',key_name='tempest-keypair-950390145',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d85f51547e5244e495343281725fe320',ramdisk_id='',reservation_id='r-jjr705pr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-2130575493',owner_user_name='tempest-AttachVolumeTestJSON-2130575493-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:50:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25fb0d890b594080bb1bb99dd6294ff1',uuid=566040e7-8140-467b-b814-8d7eb62ef735,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converting VIF {"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG os_vif [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69a7ef96-ea, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69a7ef96-ea, col_values=(('external_ids', {'iface-id': '69a7ef96-ead5-4890-a014-d86e90fa5050', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:00:b3', 'vm-uuid': '566040e7-8140-467b-b814-8d7eb62ef735'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:09 user nova-compute[70954]: INFO os_vif [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:50:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] No VIF found with MAC fa:16:3e:00:00:b3, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG nova.network.neutron [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updated VIF entry in instance network info cache for port 69a7ef96-ead5-4890-a014-d86e90fa5050. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG nova.network.neutron [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updating instance_info_cache with network_info: [{"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6c681de5-b08b-48c8-9f9f-11de0aa5bc5f req-5bcf55ed-c989-4ab1-9e68-4cea5891ae10 service nova] Releasing lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:50:10 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] VM Stopped (Lifecycle Event) Apr 21 10:50:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-3cfb27b8-9b67-4de6-a3fb-5b36ce8ae565 None None] [instance: c3100b42-46b3-4371-89f2-e511ca1ce6cd] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-9d443264-ad9a-4e85-b941-f123e584a239 req-d8eff175-2c03-4109-8516-2168e3096ce7 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9d443264-ad9a-4e85-b941-f123e584a239 req-d8eff175-2c03-4109-8516-2168e3096ce7 service nova] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9d443264-ad9a-4e85-b941-f123e584a239 req-d8eff175-2c03-4109-8516-2168e3096ce7 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9d443264-ad9a-4e85-b941-f123e584a239 req-d8eff175-2c03-4109-8516-2168e3096ce7 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-9d443264-ad9a-4e85-b941-f123e584a239 req-d8eff175-2c03-4109-8516-2168e3096ce7 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] No waiting events found dispatching network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:50:11 user nova-compute[70954]: WARNING nova.compute.manager [req-9d443264-ad9a-4e85-b941-f123e584a239 req-d8eff175-2c03-4109-8516-2168e3096ce7 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received unexpected event network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 for instance with vm_state building and task_state spawning. Apr 21 10:50:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:50:13 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] VM Resumed (Lifecycle Event) Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 
566040e7-8140-467b-b814-8d7eb62ef735] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:50:13 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Instance spawned successfully. Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:13 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:50:13 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] VM Started (Lifecycle Event) Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-3ece06d3-0254-4803-bbfa-53e2886a0052 req-cb4fc0b5-0115-4f1f-a870-b56712f3c697 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ece06d3-0254-4803-bbfa-53e2886a0052 req-cb4fc0b5-0115-4f1f-a870-b56712f3c697 service nova] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ece06d3-0254-4803-bbfa-53e2886a0052 req-cb4fc0b5-0115-4f1f-a870-b56712f3c697 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3ece06d3-0254-4803-bbfa-53e2886a0052 req-cb4fc0b5-0115-4f1f-a870-b56712f3c697 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-3ece06d3-0254-4803-bbfa-53e2886a0052 req-cb4fc0b5-0115-4f1f-a870-b56712f3c697 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] No waiting events found dispatching network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:50:13 user nova-compute[70954]: WARNING nova.compute.manager [req-3ece06d3-0254-4803-bbfa-53e2886a0052 req-cb4fc0b5-0115-4f1f-a870-b56712f3c697 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received unexpected event network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 for instance with vm_state building and task_state spawning. 
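The exchange above — the "566040e7-…-events" lock acquired and released around pop_instance_event, followed by the "Received unexpected event network-vif-plugged-…" warning — is Nova's external-event handshake with Neutron: the compute manager pops any registered waiter for the event under a per-instance lock, and when nothing is waiting (typically because the spawn path is not blocked on that event at that moment) it only logs the warning, which is benign while vm_state is building. A minimal sketch of that pattern follows; it uses the real oslo_concurrency.lockutils context manager, but every other name is hypothetical and this is an illustration of the handshake, not Nova's actual InstanceEvents code.

    # Sketch of the per-instance event handshake; apart from
    # oslo_concurrency.lockutils, all names here are hypothetical.
    import threading
    from oslo_concurrency import lockutils

    _waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(instance_uuid, event_name):
        # The spawn path registers interest before triggering the action
        # that will eventually make Neutron emit the event.
        with lockutils.lock(f"{instance_uuid}-events"):
            ev = threading.Event()
            _waiters[(instance_uuid, event_name)] = ev
            return ev

    def pop_instance_event(instance_uuid, event_name):
        # The external-event handler pops any waiter under the same lock;
        # the "Acquiring/acquired/released" lines above come from a block
        # like this one.
        with lockutils.lock(f"{instance_uuid}-events"):
            return _waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(instance_uuid, event_name):
        waiter = pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # Nothing was waiting: Nova logs the "Received unexpected
            # event ..." warning, harmless while the instance is spawning.
            print(f"unexpected event {event_name} for {instance_uuid}")
        else:
            waiter.set()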
Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:50:13 user nova-compute[70954]: INFO nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Took 5.42 seconds to spawn the instance on the hypervisor. Apr 21 10:50:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:13 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:50:13 user nova-compute[70954]: INFO nova.compute.manager [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Took 6.07 seconds to build instance. 
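The sync_power_state lines above (DB power_state 0 vs. VM power_state 1 with task_state spawning, then "During sync_power_state the instance has a pending task (spawning). Skip.") show the lifecycle-driven power-state reconciliation deliberately standing down while a task still owns the instance. A simplified, illustrative sketch of that decision follows; the constants mirror nova.compute.power_state, but the function itself is an assumption-laden stand-in, not Nova's actual _sync_instance_power_state.

    NOSTATE = 0  # what the database still records while the guest is being built
    RUNNING = 1  # what libvirt reports once the domain is up

    def sync_power_state(db_power_state, vm_power_state, task_state):
        # Assumption: a pending task (here 'spawning') owns the instance,
        # so the reconciliation path must not act on the state mismatch.
        if task_state is not None:
            return f"skip: pending task {task_state}"
        if db_power_state != vm_power_state:
            # Outside of a task, Nova would persist the hypervisor's view
            # (and may correct the guest if the states contradict).
            return f"update DB power_state to {vm_power_state}"
        return "in sync"

    print(sync_power_state(NOSTATE, RUNNING, "spawning"))
    # -> skip: pending task spawning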
Apr 21 10:50:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-95bdc197-fce9-43d7-a48c-1aedf2e7c9a8 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "566040e7-8140-467b-b814-8d7eb62ef735" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.163s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:50 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:50:53 user nova-compute[70954]: INFO nova.compute.claims [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Claim successful on node user Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.400s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 
tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:50:53 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.policy [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54c67d90b6014d9ea24ef2552006bc04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aad84a0e014f47ddaeaddc88bf16b0a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:50:53 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Creating image(s) Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "/opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "/opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "/opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.141s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk 1073741824" returned: 0 in 0.049s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.198s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.150s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Cannot resize image /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lazy-loading 'migration_context' on Instance uuid f4dda568-8f3b-40eb-aff3-64d3e759c310 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Ensure instance console log exists: /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:54 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Successfully created port: 0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Successfully updated port: 0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "refresh_cache-f4dda568-8f3b-40eb-aff3-64d3e759c310" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquired lock "refresh_cache-f4dda568-8f3b-40eb-aff3-64d3e759c310" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-changed-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG 
nova.compute.manager [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Refreshing instance network info cache due to event network-changed-0e9676a1-1652-48fd-affd-355632de3ca2. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] Acquiring lock "refresh_cache-f4dda568-8f3b-40eb-aff3-64d3e759c310" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:55 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Updating instance_info_cache with network_info: [{"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Releasing lock "refresh_cache-f4dda568-8f3b-40eb-aff3-64d3e759c310" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Instance network_info: |[{"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": 
"fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] Acquired lock "refresh_cache-f4dda568-8f3b-40eb-aff3-64d3e759c310" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.neutron [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Refreshing network info cache for port 0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Start _get_guest_xml network_info=[{"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:50:56 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:50:56 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:50:56 user nova-compute[70954]: 
DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f42bea43-f64c-4884-a48e-8452c8825d79 
tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2043125688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2043125688',id=12,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-5710yc5p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:50:54Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=f4dda568-8f3b-40eb-aff3-64d3e759c310,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 
tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lazy-loading 'pci_devices' on Instance uuid f4dda568-8f3b-40eb-aff3-64d3e759c310 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] End _get_guest_xml xml=
[libvirt guest XML not preserved in this capture (markup stripped); surviving values: uuid f4dda568-8f3b-40eb-aff3-64d3e759c310, name instance-0000000c, memory 131072 KiB, 1 vCPU, nova:name tempest-ServerBootFromVolumeStableRescueTest-server-2043125688, creationTime 2023-04-21 10:50:56, flavor memory_mb 128 / vcpus 1 / root_gb 1 / ephemeral 0 / swap 0, owner user tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member, project tempest-ServerBootFromVolumeStableRescueTest-1980957418, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom]
{{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2043125688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2043125688',id=12,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-5710yc5p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:50:54Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=f4dda568-8f3b-40eb-aff3-64d3e759c310,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": 
"fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG os_vif [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e9676a1-16, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:56 
user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e9676a1-16, col_values=(('external_ids', {'iface-id': '0e9676a1-1652-48fd-affd-355632de3ca2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:77:eb', 'vm-uuid': 'f4dda568-8f3b-40eb-aff3-64d3e759c310'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:56 user nova-compute[70954]: INFO os_vif [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] No VIF found with MAC fa:16:3e:eb:77:eb, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.neutron [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Updated VIF entry in instance network info cache for port 0e9676a1-1652-48fd-affd-355632de3ca2. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG nova.network.neutron [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Updating instance_info_cache with network_info: [{"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:50:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-645e122f-aab0-44ec-afd5-13fa533fa501 req-2c5660a9-9a42-4a01-8ec5-e2ac64678718 service nova] Releasing lock "refresh_cache-f4dda568-8f3b-40eb-aff3-64d3e759c310" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-5ba7148b-27a3-4039-817a-31f1e0611969 req-e9ae87da-501d-41ab-b656-09e3d49e1b2d service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5ba7148b-27a3-4039-817a-31f1e0611969 req-e9ae87da-501d-41ab-b656-09e3d49e1b2d service nova] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [req-5ba7148b-27a3-4039-817a-31f1e0611969 req-e9ae87da-501d-41ab-b656-09e3d49e1b2d service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-5ba7148b-27a3-4039-817a-31f1e0611969 req-e9ae87da-501d-41ab-b656-09e3d49e1b2d service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-5ba7148b-27a3-4039-817a-31f1e0611969 req-e9ae87da-501d-41ab-b656-09e3d49e1b2d service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] No waiting events found dispatching network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:50:57 user nova-compute[70954]: WARNING nova.compute.manager [req-5ba7148b-27a3-4039-817a-31f1e0611969 req-e9ae87da-501d-41ab-b656-09e3d49e1b2d service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received unexpected event network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 for instance with vm_state building and task_state spawning. Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:50:59 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] VM Resumed (Lifecycle Event) Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:50:59 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Instance spawned successfully. 
Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Found default for hw_video_model of virtio 
{{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:50:59 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:50:59 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] VM Started (Lifecycle Event) Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:50:59 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:50:59 user nova-compute[70954]: INFO nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Took 6.03 seconds to spawn the instance on the hypervisor. 
Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [req-1d214b45-44fb-408f-aac9-1eea8ee67b6b req-e71ba1da-03cc-4341-9356-d2f9edc9f720 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1d214b45-44fb-408f-aac9-1eea8ee67b6b req-e71ba1da-03cc-4341-9356-d2f9edc9f720 service nova] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1d214b45-44fb-408f-aac9-1eea8ee67b6b req-e71ba1da-03cc-4341-9356-d2f9edc9f720 service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1d214b45-44fb-408f-aac9-1eea8ee67b6b req-e71ba1da-03cc-4341-9356-d2f9edc9f720 service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [req-1d214b45-44fb-408f-aac9-1eea8ee67b6b req-e71ba1da-03cc-4341-9356-d2f9edc9f720 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] No waiting events found dispatching network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:50:59 user nova-compute[70954]: WARNING nova.compute.manager [req-1d214b45-44fb-408f-aac9-1eea8ee67b6b req-e71ba1da-03cc-4341-9356-d2f9edc9f720 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received unexpected event network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 for instance with vm_state building and task_state spawning. Apr 21 10:50:59 user nova-compute[70954]: INFO nova.compute.manager [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Took 6.75 seconds to build instance. 
Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f42bea43-f64c-4884-a48e-8452c8825d79 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.863s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:50:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updating instance_info_cache with network_info: [{"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-15bf9321-a92e-4be2-bcae-a943988c811a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:00 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:01 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 
21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:03 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:51:03 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8518MB free_disk=26.47954559326172GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 15bf9321-a92e-4be2-bcae-a943988c811a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 8ae797bd-c587-43a3-b941-e6d6d6c74e51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 3dd95a26-8652-40f5-b357-3cbc8a38628a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 566040e7-8140-467b-b814-8d7eb62ef735 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f4dda568-8f3b-40eb-aff3-64d3e759c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:51:03 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:51:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:51:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils 
[None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "15bf9321-a92e-4be2-bcae-a943988c811a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:05 user nova-compute[70954]: INFO nova.compute.manager [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Terminating instance Apr 21 10:51:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:51:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-2098cd12-8a39-404f-bdc9-f1d8bf86d4eb req-2ee0ff50-711b-43fb-ba95-ebe03ea3f764 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-vif-unplugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2098cd12-8a39-404f-bdc9-f1d8bf86d4eb req-2ee0ff50-711b-43fb-ba95-ebe03ea3f764 service nova] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2098cd12-8a39-404f-bdc9-f1d8bf86d4eb req-2ee0ff50-711b-43fb-ba95-ebe03ea3f764 service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2098cd12-8a39-404f-bdc9-f1d8bf86d4eb req-2ee0ff50-711b-43fb-ba95-ebe03ea3f764 service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-2098cd12-8a39-404f-bdc9-f1d8bf86d4eb req-2ee0ff50-711b-43fb-ba95-ebe03ea3f764 service nova] [instance: 
15bf9321-a92e-4be2-bcae-a943988c811a] No waiting events found dispatching network-vif-unplugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-2098cd12-8a39-404f-bdc9-f1d8bf86d4eb req-2ee0ff50-711b-43fb-ba95-ebe03ea3f764 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-vif-unplugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Instance destroyed successfully. Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.objects.instance [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lazy-loading 'resources' on Instance uuid 15bf9321-a92e-4be2-bcae-a943988c811a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-830796194',display_name='tempest-ServerStableDeviceRescueTest-server-830796194',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-830796194',id=4,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHOLDYACefR6pJbo/1FNlJcU0uVLNhwHibyQxq6uGxw58mBzlostFRjitCW5kqYo4/rT+TGHwIPAMOKYgrhYN17TXx7fyo6rQDJa7QLpDa2shAHPXuXXSRjnzvc+xkQMdw==',key_name='tempest-keypair-474843765',keypairs=,launch_index=0,launched_at=2023-04-21T10:47:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='4bdd7a4ccfc340aa9c1b02c57f7a0e70',ramdisk_id='',reservation_id='r-4605i77j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-335595160',owner_user_name='tempest-ServerStableDeviceRescueTest-335595160-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:49:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d172d648a9474db082646a47a2840214',uuid=15bf9321-a92e-4be2-bcae-a943988c811a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Converting VIF {"id": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "address": "fa:16:3e:60:6e:bf", "network": {"id": "ba9d5253-efcc-4b0a-8cda-778a5a337551", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-310377863-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "4bdd7a4ccfc340aa9c1b02c57f7a0e70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfca8b6a6-fd", "ovs_interfaceid": "fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG os_vif [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfca8b6a6-fd, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:51:06 user nova-compute[70954]: INFO os_vif [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:6e:bf,bridge_name='br-int',has_traffic_filtering=True,id=fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d,network=Network(ba9d5253-efcc-4b0a-8cda-778a5a337551),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfca8b6a6-fd') Apr 21 10:51:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None 
req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Deleting instance files /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a_del Apr 21 10:51:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Deletion of /opt/stack/data/nova/instances/15bf9321-a92e-4be2-bcae-a943988c811a_del complete Apr 21 10:51:06 user nova-compute[70954]: INFO nova.compute.manager [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Took 0.93 seconds to destroy the instance on the hypervisor. Apr 21 10:51:06 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:51:06 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:51:07 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Took 0.89 seconds to deallocate network for instance. 
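Most of the surrounding DEBUG traffic is oslo.concurrency lock tracing: each "Acquiring lock ... by ...", "acquired by ... :: waited" and ""released" by ... :: held" triple is emitted by the `synchronized` decorator wrapping the named method. A minimal sketch of the same mechanism, with an illustrative lock name and function rather than Nova's own code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_usage():
        # Concurrent callers decorated with the same lock name serialize here;
        # the waited/held durations seen in the log are reported at DEBUG.
        pass

    update_usage()
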
Apr 21 10:51:07 user nova-compute[70954]: DEBUG nova.compute.manager [req-dd339c0e-1d53-4d0f-a17c-5b9feaba5258 req-4c5013be-c23c-49d5-a39c-c1b1be4160c4 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-vif-deleted-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:51:07 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.240s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:08 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Deleted allocations for instance 15bf9321-a92e-4be2-bcae-a943988c811a Apr 21 10:51:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35f7c72e-f266-4645-8552-e324620d5993 tempest-ServerStableDeviceRescueTest-335595160 tempest-ServerStableDeviceRescueTest-335595160-project-member] Lock "15bf9321-a92e-4be2-bcae-a943988c811a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.247s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-d84d8081-de08-4f85-be1c-a2ee7323d2a6 req-24585973-2b1d-4677-9c35-0eace5cf7f23 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received event network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d84d8081-de08-4f85-be1c-a2ee7323d2a6 req-24585973-2b1d-4677-9c35-0eace5cf7f23 service nova] Acquiring lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d84d8081-de08-4f85-be1c-a2ee7323d2a6 req-24585973-2b1d-4677-9c35-0eace5cf7f23 service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d84d8081-de08-4f85-be1c-a2ee7323d2a6 req-24585973-2b1d-4677-9c35-0eace5cf7f23 service nova] Lock "15bf9321-a92e-4be2-bcae-a943988c811a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-d84d8081-de08-4f85-be1c-a2ee7323d2a6 req-24585973-2b1d-4677-9c35-0eace5cf7f23 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] No waiting events found dispatching network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:51:08 user nova-compute[70954]: WARNING nova.compute.manager [req-d84d8081-de08-4f85-be1c-a2ee7323d2a6 req-24585973-2b1d-4677-9c35-0eace5cf7f23 service nova] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Received unexpected event network-vif-plugged-fca8b6a6-fd45-4ba5-b1b1-fc40da2fc33d for instance with vm_state deleted and task_state None. Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:51:09 user nova-compute[70954]: INFO nova.compute.claims [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Claim successful on node user Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:51:09 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.policy [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cae5a1734d24ac8aebc233dd31d3084', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ead44a7da0640cbb2cf8dece0ea4f40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:51:09 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Creating image(s) Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "/opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "/opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "/opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "f1c9ded4da53e78155bd837a12b25dfdaa1b7238" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f1c9ded4da53e78155bd837a12b25dfdaa1b7238" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.part --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.part --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG nova.virt.images [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] 86c32514-9140-48a4-8ce6-baeafcca9587 was qcow2, converting to raw {{(pid=70954) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.part /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.converted {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.part /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.converted" returned: 0 in 0.121s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.converted --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:10 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Successfully created port: 516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238.converted --force-share --output=json" returned: 0 in 0.166s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f1c9ded4da53e78155bd837a12b25dfdaa1b7238" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.070s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238 --force-share --output=json" returned: 0 in 0.158s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "f1c9ded4da53e78155bd837a12b25dfdaa1b7238" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f1c9ded4da53e78155bd837a12b25dfdaa1b7238" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238 --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238,backing_fmt=raw /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238,backing_fmt=raw /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk 1073741824" returned: 0 in 0.047s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f1c9ded4da53e78155bd837a12b25dfdaa1b7238" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.182s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f1c9ded4da53e78155bd837a12b25dfdaa1b7238 --force-share --output=json" returned: 0 in 0.158s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Checking if we can resize image /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Cannot resize image /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lazy-loading 'migration_context' on Instance uuid 69031436-19d1-4cc1-91e7-4d99381b6ae3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Ensure instance console log exists: /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Successfully updated port: 516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "refresh_cache-69031436-19d1-4cc1-91e7-4d99381b6ae3" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquired lock "refresh_cache-69031436-19d1-4cc1-91e7-4d99381b6ae3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-changed-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Refreshing instance network info cache due to event network-changed-516728bc-fbfa-4318-bbac-6c94509ee008. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] Acquiring lock "refresh_cache-69031436-19d1-4cc1-91e7-4d99381b6ae3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:51:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Updating instance_info_cache with network_info: [{"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Releasing lock "refresh_cache-69031436-19d1-4cc1-91e7-4d99381b6ae3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Instance network_info: |[{"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] Acquired lock "refresh_cache-69031436-19d1-4cc1-91e7-4d99381b6ae3" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Refreshing network info cache for port 516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Start _get_guest_xml network_info=[{"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:51:07Z,direct_url=,disk_format='qcow2',id=86c32514-9140-48a4-8ce6-baeafcca9587,min_disk=0,min_ram=0,name='tempest-scenario-img--1132943129',owner='9ead44a7da0640cbb2cf8dece0ea4f40',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:51:08Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '86c32514-9140-48a4-8ce6-baeafcca9587'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:51:12 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:51:12 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:51:07Z,direct_url=,disk_format='qcow2',id=86c32514-9140-48a4-8ce6-baeafcca9587,min_disk=0,min_ram=0,name='tempest-scenario-img--1132943129',owner='9ead44a7da0640cbb2cf8dece0ea4f40',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:51:08Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2059401768',display_name='tempest-TestMinimumBasicScenario-server-2059401768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-2059401768',id=13,image_ref='86c32514-9140-48a4-8ce6-baeafcca9587',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdIcpjsQgofTWxPhSCIN/mpUwMbeXhtJNFymtTMxwmV7Bxki7QmQko9CAbLh0f7+BwLKjoTQlfvvfT0xvv8wlFZ5IIip5N+0UUo9CL93KfHE8COz3zk4cngjZQ1yxLNwA==',key_name='tempest-TestMinimumBasicScenario-990689924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead44a7da0640cbb2cf8dece0ea4f40',ramdisk_id='',reservation_id='r-i4ys6mg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86c32514-9140-48a4-8ce6-baeafcca9587',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-339882477',owner_user_name='tempest-TestMinimumBasicScenario-339882477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:51:10Z,user_data=None,user_id='6cae5a1734d24ac8aebc233dd31d3084',uuid=69031436-19d1-4cc1-91e7-4d99381b6ae3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converting VIF {"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": 
"516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lazy-loading 'pci_devices' on Instance uuid 69031436-19d1-4cc1-91e7-4d99381b6ae3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] End _get_guest_xml xml= Apr 21 10:51:12 user nova-compute[70954]: 69031436-19d1-4cc1-91e7-4d99381b6ae3 Apr 21 10:51:12 user nova-compute[70954]: instance-0000000d Apr 21 10:51:12 user nova-compute[70954]: 131072 Apr 21 10:51:12 user nova-compute[70954]: 1 Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: tempest-TestMinimumBasicScenario-server-2059401768 Apr 21 10:51:12 user nova-compute[70954]: 2023-04-21 10:51:12 Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: 128 Apr 21 10:51:12 user nova-compute[70954]: 1 Apr 21 10:51:12 user nova-compute[70954]: 0 Apr 21 10:51:12 user nova-compute[70954]: 0 Apr 21 10:51:12 user nova-compute[70954]: 1 Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: tempest-TestMinimumBasicScenario-339882477-project-member Apr 21 10:51:12 user nova-compute[70954]: tempest-TestMinimumBasicScenario-339882477 Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: OpenStack Foundation Apr 21 10:51:12 user nova-compute[70954]: OpenStack Nova Apr 21 10:51:12 user nova-compute[70954]: 0.0.0 Apr 21 10:51:12 user nova-compute[70954]: 69031436-19d1-4cc1-91e7-4d99381b6ae3 Apr 21 10:51:12 user nova-compute[70954]: 69031436-19d1-4cc1-91e7-4d99381b6ae3 Apr 21 10:51:12 user nova-compute[70954]: Virtual Machine Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 
21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: hvm Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Nehalem Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: /dev/urandom Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: Apr 21 10:51:12 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2059401768',display_name='tempest-TestMinimumBasicScenario-server-2059401768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-2059401768',id=13,image_ref='86c32514-9140-48a4-8ce6-baeafcca9587',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdIcpjsQgofTWxPhSCIN/mpUwMbeXhtJNFymtTMxwmV7Bxki7QmQko9CAbLh0f7+BwLKjoTQlfvvfT0xvv8wlFZ5IIip5N+0UUo9CL93KfHE8COz3zk4cngjZQ1yxLNwA==',key_name='tempest-TestMinimumBasicScenario-990689924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead44a7da0640cbb2cf8dece0ea4f40',ramdisk_id='',reservation_id='r-i4ys6mg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86c32514-9140-48a4-8ce6-baeafcca9587',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-339882477',owner_user_name='tempest-TestMinimumBasicScenario-339882477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:51:10Z,user_data=None,user_id='6cae5a1734d24ac8aebc233dd31d3084',uuid=69031436-19d1-4cc1-91e7-4d99381b6ae3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converting VIF {"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": 
"516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG os_vif [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap516728bc-fb, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap516728bc-fb, col_values=(('external_ids', {'iface-id': '516728bc-fbfa-4318-bbac-6c94509ee008', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:cb:12', 'vm-uuid': '69031436-19d1-4cc1-91e7-4d99381b6ae3'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:12 user nova-compute[70954]: INFO os_vif [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] No VIF found with MAC fa:16:3e:b8:cb:12, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Updated VIF entry in instance network info cache for port 516728bc-fbfa-4318-bbac-6c94509ee008. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG nova.network.neutron [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Updating instance_info_cache with network_info: [{"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-2d5e49ba-3080-484a-bde9-020b497f0d6d req-23eacc69-f06e-429b-8c08-5e1739ea5d69 service nova] Releasing lock "refresh_cache-69031436-19d1-4cc1-91e7-4d99381b6ae3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:51:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-7777a554-1712-427f-af25-3027ea387409 req-0e920e91-0fd6-4bd9-84fb-a5da7e8ff21b service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7777a554-1712-427f-af25-3027ea387409 req-0e920e91-0fd6-4bd9-84fb-a5da7e8ff21b service nova] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7777a554-1712-427f-af25-3027ea387409 req-0e920e91-0fd6-4bd9-84fb-a5da7e8ff21b service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7777a554-1712-427f-af25-3027ea387409 req-0e920e91-0fd6-4bd9-84fb-a5da7e8ff21b service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-7777a554-1712-427f-af25-3027ea387409 req-0e920e91-0fd6-4bd9-84fb-a5da7e8ff21b service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] No waiting events found dispatching network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:51:13 user nova-compute[70954]: WARNING nova.compute.manager [req-7777a554-1712-427f-af25-3027ea387409 req-0e920e91-0fd6-4bd9-84fb-a5da7e8ff21b service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received unexpected event network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 for instance with vm_state building and task_state spawning. Apr 21 10:51:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:51:15 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] VM Resumed (Lifecycle Event) Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:51:15 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Instance spawned successfully. 
Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 
69031436-19d1-4cc1-91e7-4d99381b6ae3] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:51:15 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:51:15 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] VM Started (Lifecycle Event) Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:51:15 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:51:15 user nova-compute[70954]: INFO nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Took 6.03 seconds to spawn the instance on the hypervisor. 
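The qemu-img CMD lines earlier in this spawn (info on the cached base image, then create of the qcow2 overlay with backing_file/backing_fmt) are run through oslo.concurrency's processutils, which prepends the oslo_concurrency.prlimit wrapper that shows up as --as=1073741824 --cpu=30. A rough sketch of that pattern, assuming illustrative helper names and paths rather than Nova's actual code:

```python
# Sketch: run qemu-img under the same address-space / CPU limits that appear
# in the trace (--as=1073741824 --cpu=30). Paths and helper names are examples.
import json
from oslo_concurrency import processutils

QEMU_IMG_LIMITS = processutils.ProcessLimits(
    address_space=1 * 1024 * 1024 * 1024,  # rendered as --as=1073741824
    cpu_time=30)                           # rendered as --cpu=30

def qemu_img_info(path):
    # Mirrors: env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return json.loads(out)

def create_overlay(base, overlay, size_bytes):
    # Mirrors: qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <overlay> <size>
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % base,
        overlay, str(size_bytes))
```

Capping the address space and CPU time keeps qemu-img from consuming the host while probing a malformed or hostile image file.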
Apr 21 10:51:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:51:16 user nova-compute[70954]: DEBUG nova.compute.manager [req-cb7ff0c8-672d-4f99-8da0-05bd839b7b4e req-ee6d9def-ffa5-4c0e-ac37-a0e5249d0f72 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-cb7ff0c8-672d-4f99-8da0-05bd839b7b4e req-ee6d9def-ffa5-4c0e-ac37-a0e5249d0f72 service nova] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-cb7ff0c8-672d-4f99-8da0-05bd839b7b4e req-ee6d9def-ffa5-4c0e-ac37-a0e5249d0f72 service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-cb7ff0c8-672d-4f99-8da0-05bd839b7b4e req-ee6d9def-ffa5-4c0e-ac37-a0e5249d0f72 service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:16 user nova-compute[70954]: DEBUG nova.compute.manager [req-cb7ff0c8-672d-4f99-8da0-05bd839b7b4e req-ee6d9def-ffa5-4c0e-ac37-a0e5249d0f72 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] No waiting events found dispatching network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:51:16 user nova-compute[70954]: WARNING nova.compute.manager [req-cb7ff0c8-672d-4f99-8da0-05bd839b7b4e req-ee6d9def-ffa5-4c0e-ac37-a0e5249d0f72 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received unexpected event network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 for instance with vm_state building and task_state spawning. Apr 21 10:51:16 user nova-compute[70954]: INFO nova.compute.manager [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Took 6.67 seconds to build instance. 
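The Acquiring/acquired/released lock lines that bracket almost every step in this trace (including the per-instance build lock released in the next entry) are produced by oslo.concurrency's lockutils. A small sketch of the two forms involved; the function and lock names are placeholders patterned on the ones in the log:

```python
# Sketch: the 'Lock "..." acquired by ... :: waited' / ':: held' messages come from
# the lockutils.synchronized decorator; the bare Acquiring/Acquired/Releasing lines
# come from the lockutils.lock context manager. Names below are placeholders.
from oslo_concurrency import lockutils

@lockutils.synchronized('69031436-19d1-4cc1-91e7-4d99381b6ae3')
def locked_build_and_run_instance():
    # Runs with the per-instance lock held; wait and hold durations are logged on exit.
    pass

def refresh_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        # Emits the Acquiring/Acquired/Releasing lock "refresh_cache-<uuid>" DEBUG lines.
        pass
```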
Apr 21 10:51:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f1fc51a9-2224-426d-a626-3f57ffb84b04 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.788s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:19 user nova-compute[70954]: INFO nova.compute.manager [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Terminating instance Apr 21 10:51:19 user nova-compute[70954]: DEBUG 
nova.compute.manager [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-97cb3582-f730-4a89-a0bf-9546f7156f9e req-4be49072-ffcb-4d62-b574-ea802f8b8bce service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-vif-unplugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-97cb3582-f730-4a89-a0bf-9546f7156f9e req-4be49072-ffcb-4d62-b574-ea802f8b8bce service nova] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-97cb3582-f730-4a89-a0bf-9546f7156f9e req-4be49072-ffcb-4d62-b574-ea802f8b8bce service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-97cb3582-f730-4a89-a0bf-9546f7156f9e req-4be49072-ffcb-4d62-b574-ea802f8b8bce service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-97cb3582-f730-4a89-a0bf-9546f7156f9e req-4be49072-ffcb-4d62-b574-ea802f8b8bce service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] No waiting events found dispatching network-vif-unplugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-97cb3582-f730-4a89-a0bf-9546f7156f9e req-4be49072-ffcb-4d62-b574-ea802f8b8bce service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-vif-unplugged-76364ccf-028e-4291-b953-431265bcfabb for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:51:19 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Instance destroyed successfully. Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lazy-loading 'resources' on Instance uuid 3dd95a26-8652-40f5-b357-3cbc8a38628a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:49:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1890729888',display_name='tempest-VolumesAdminNegativeTest-server-1890729888',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1890729888',id=9,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:49:33Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='94e77e1735854e0c966c42e9a613017f',ramdisk_id='',reservation_id='r-bg3jhvup',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-243340095',owner_user_name='tempest-VolumesAdminNegativeTest-243340095-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:49:34Z,user_data=None,user_id='c600e01acfe140cabcdfe54958e66108',uuid=3dd95a26-8652-40f5-b357-3cbc8a38628a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converting VIF {"id": "76364ccf-028e-4291-b953-431265bcfabb", "address": "fa:16:3e:10:ff:81", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap76364ccf-02", "ovs_interfaceid": "76364ccf-028e-4291-b953-431265bcfabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG os_vif [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76364ccf-02, bridge=br-int, if_exists=True) {{(pid=70954) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:51:19 user nova-compute[70954]: INFO os_vif [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ff:81,bridge_name='br-int',has_traffic_filtering=True,id=76364ccf-028e-4291-b953-431265bcfabb,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76364ccf-02') Apr 21 10:51:19 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Deleting instance files /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a_del Apr 21 10:51:19 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Deletion of /opt/stack/data/nova/instances/3dd95a26-8652-40f5-b357-3cbc8a38628a_del complete Apr 21 10:51:19 user nova-compute[70954]: INFO nova.compute.manager [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 21 10:51:19 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:51:19 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:51:20 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:51:20 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Took 0.52 seconds to deallocate network for instance. 
Apr 21 10:51:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:20 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:51:20 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:51:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:20 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Deleted allocations for instance 3dd95a26-8652-40f5-b357-3cbc8a38628a Apr 21 10:51:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d8fe6718-b596-43cc-9e62-87ca8cbaa249 tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.596s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] Acquiring lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:51:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:51:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] Lock "3dd95a26-8652-40f5-b357-3cbc8a38628a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:51:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] No waiting events found dispatching network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:51:21 user nova-compute[70954]: WARNING nova.compute.manager [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received unexpected event network-vif-plugged-76364ccf-028e-4291-b953-431265bcfabb for instance with vm_state deleted and task_state None. 
Apr 21 10:51:21 user nova-compute[70954]: DEBUG nova.compute.manager [req-7f6c3b21-1e20-4e99-be21-9676f6098ee0 req-99eb1c7e-db16-4846-994b-afbafc61b497 service nova] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Received event network-vif-deleted-76364ccf-028e-4291-b953-431265bcfabb {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:21 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:51:21 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] VM Stopped (Lifecycle Event) Apr 21 10:51:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f18c219d-c078-4bc3-b27f-d11fddff5040 None None] [instance: 15bf9321-a92e-4be2-bcae-a943988c811a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:51:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:34 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:51:34 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] VM Stopped (Lifecycle Event) Apr 21 10:51:34 user nova-compute[70954]: DEBUG nova.compute.manager [None req-c2263264-8d55-4f44-b4e8-fae8e8c8a780 None None] [instance: 3dd95a26-8652-40f5-b357-3cbc8a38628a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:51:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:51:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-changed-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:51:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Refreshing instance network info cache due to event network-changed-69a7ef96-ead5-4890-a014-d86e90fa5050. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:51:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] Acquiring lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:51:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] Acquired lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:51:58 user nova-compute[70954]: DEBUG nova.network.neutron [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Refreshing network info cache for port 69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG nova.network.neutron [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updated VIF entry in instance network info cache for port 69a7ef96-ead5-4890-a014-d86e90fa5050. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG nova.network.neutron [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updating instance_info_cache with network_info: [{"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d9653557-6983-462d-8ce7-a655d9949ca2 req-fedebd5a-12a6-438a-9a43-36c6f6ea4f95 service nova] Releasing lock "refresh_cache-566040e7-8140-467b-b814-8d7eb62ef735" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:51:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "566040e7-8140-467b-b814-8d7eb62ef735" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:00 user nova-compute[70954]: INFO nova.compute.manager [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Terminating instance Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.compute.manager [req-620b215c-ffe9-4411-a30f-3134a16a9fc1 req-bcc81eba-e070-4fbc-85d2-e5eef0b1ff75 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-vif-unplugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-620b215c-ffe9-4411-a30f-3134a16a9fc1 req-bcc81eba-e070-4fbc-85d2-e5eef0b1ff75 service nova] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-620b215c-ffe9-4411-a30f-3134a16a9fc1 req-bcc81eba-e070-4fbc-85d2-e5eef0b1ff75 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-620b215c-ffe9-4411-a30f-3134a16a9fc1 req-bcc81eba-e070-4fbc-85d2-e5eef0b1ff75 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.compute.manager [req-620b215c-ffe9-4411-a30f-3134a16a9fc1 req-bcc81eba-e070-4fbc-85d2-e5eef0b1ff75 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] No waiting events found dispatching network-vif-unplugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.compute.manager [req-620b215c-ffe9-4411-a30f-3134a16a9fc1 req-bcc81eba-e070-4fbc-85d2-e5eef0b1ff75 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-vif-unplugged-69a7ef96-ead5-4890-a014-d86e90fa5050 for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:52:00 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Instance destroyed successfully. Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.objects.instance [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lazy-loading 'resources' on Instance uuid 566040e7-8140-467b-b814-8d7eb62ef735 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-778341132',display_name='tempest-AttachVolumeTestJSON-server-778341132',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-778341132',id=11,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4Me401imS8FOFYXCWEtUHYjBl+nNDFDOEp9F0qU3EcEjNmrofO3LBhufyQq8+T19fUEnsB1kP8hSrvs1kB/Y0kxUe6+elfuW9GrNxUHrtfrboj4/KWC2DfC017u1bvqA==',key_name='tempest-keypair-950390145',keypairs=,launch_index=0,launched_at=2023-04-21T10:50:13Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d85f51547e5244e495343281725fe320',ramdisk_id='',reservation_id='r-jjr705pr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-2130575493',owner_user_name='tempest-AttachVolumeTestJSON-2130575493-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:50:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='25fb0d890b594080bb1bb99dd6294ff1',uuid=566040e7-8140-467b-b814-8d7eb62ef735,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converting VIF {"id": "69a7ef96-ead5-4890-a014-d86e90fa5050", "address": "fa:16:3e:00:00:b3", "network": {"id": "b24b52ac-b8ab-493e-994c-c38d7c5c7089", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1354809025-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d85f51547e5244e495343281725fe320", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69a7ef96-ea", "ovs_interfaceid": "69a7ef96-ead5-4890-a014-d86e90fa5050", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG os_vif [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69a7ef96-ea, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:52:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:52:01 user nova-compute[70954]: INFO os_vif [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:00:b3,bridge_name='br-int',has_traffic_filtering=True,id=69a7ef96-ead5-4890-a014-d86e90fa5050,network=Network(b24b52ac-b8ab-493e-994c-c38d7c5c7089),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69a7ef96-ea') Apr 21 10:52:01 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 
tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Deleting instance files /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735_del Apr 21 10:52:01 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Deletion of /opt/stack/data/nova/instances/566040e7-8140-467b-b814-8d7eb62ef735_del complete Apr 21 10:52:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:52:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:52:01 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:52:01 user nova-compute[70954]: INFO nova.compute.manager [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Took 0.92 seconds to destroy the instance on the hypervisor. Apr 21 10:52:01 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:52:01 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:52:01 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updating instance_info_cache with network_info: [{"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-8ae797bd-c587-43a3-b941-e6d6d6c74e51" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-fffdb6fe-5ef3-4e6b-8f0e-ffbf438572b5 
req-286157a5-b142-462b-93ce-7366a985bdd5 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-vif-deleted-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:02 user nova-compute[70954]: INFO nova.compute.manager [req-fffdb6fe-5ef3-4e6b-8f0e-ffbf438572b5 req-286157a5-b142-462b-93ce-7366a985bdd5 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Neutron deleted interface 69a7ef96-ead5-4890-a014-d86e90fa5050; detaching it from the instance and deleting it from the info cache Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.network.neutron [req-fffdb6fe-5ef3-4e6b-8f0e-ffbf438572b5 req-286157a5-b142-462b-93ce-7366a985bdd5 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:52:02 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Took 1.35 seconds to deallocate network for instance. Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-fffdb6fe-5ef3-4e6b-8f0e-ffbf438572b5 req-286157a5-b142-462b-93ce-7366a985bdd5 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Detach interface failed, port_id=69a7ef96-ead5-4890-a014-d86e90fa5050, reason: Instance 566040e7-8140-467b-b814-8d7eb62ef735 could not be found. {{(pid=70954) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-3c6ad9fe-5a26-41d2-9c7d-ab01e71527eb req-36cc3ecf-0383-4f53-a8f5-bdabc7ceac34 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received event network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3c6ad9fe-5a26-41d2-9c7d-ab01e71527eb req-36cc3ecf-0383-4f53-a8f5-bdabc7ceac34 service nova] Acquiring lock "566040e7-8140-467b-b814-8d7eb62ef735-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3c6ad9fe-5a26-41d2-9c7d-ab01e71527eb req-36cc3ecf-0383-4f53-a8f5-bdabc7ceac34 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3c6ad9fe-5a26-41d2-9c7d-ab01e71527eb req-36cc3ecf-0383-4f53-a8f5-bdabc7ceac34 service nova] Lock "566040e7-8140-467b-b814-8d7eb62ef735-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-3c6ad9fe-5a26-41d2-9c7d-ab01e71527eb req-36cc3ecf-0383-4f53-a8f5-bdabc7ceac34 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] No waiting events found dispatching network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:52:02 user nova-compute[70954]: WARNING nova.compute.manager [req-3c6ad9fe-5a26-41d2-9c7d-ab01e71527eb req-36cc3ecf-0383-4f53-a8f5-bdabc7ceac34 service nova] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Received unexpected event network-vif-plugged-69a7ef96-ead5-4890-a014-d86e90fa5050 for instance with vm_state deleted and task_state None. Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.231s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:02 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Deleted allocations for instance 566040e7-8140-467b-b814-8d7eb62ef735 Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a9121c5e-edc0-4d82-8ca3-776b8063a533 tempest-AttachVolumeTestJSON-2130575493 tempest-AttachVolumeTestJSON-2130575493-project-member] Lock "566040e7-8140-467b-b814-8d7eb62ef735" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.665s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:02 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json 
{{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 
None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:03 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" 
returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:04 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:52:04 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:52:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8587MB free_disk=26.485450744628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 8ae797bd-c587-43a3-b941-e6d6d6c74e51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f4dda568-8f3b-40eb-aff3-64d3e759c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 69031436-19d1-4cc1-91e7-4d99381b6ae3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None 
req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:52:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f 
tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:09 user nova-compute[70954]: INFO nova.compute.manager [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Terminating instance Apr 21 10:52:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-922b39b9-0f5e-4900-91af-95724bb537e6 req-1d9b86bc-04cb-441d-b6d0-d9dc028fe27c service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-vif-unplugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-922b39b9-0f5e-4900-91af-95724bb537e6 req-1d9b86bc-04cb-441d-b6d0-d9dc028fe27c service nova] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-922b39b9-0f5e-4900-91af-95724bb537e6 req-1d9b86bc-04cb-441d-b6d0-d9dc028fe27c service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-922b39b9-0f5e-4900-91af-95724bb537e6 req-1d9b86bc-04cb-441d-b6d0-d9dc028fe27c service nova] Lock 
"8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-922b39b9-0f5e-4900-91af-95724bb537e6 req-1d9b86bc-04cb-441d-b6d0-d9dc028fe27c service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] No waiting events found dispatching network-vif-unplugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-922b39b9-0f5e-4900-91af-95724bb537e6 req-1d9b86bc-04cb-441d-b6d0-d9dc028fe27c service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-vif-unplugged-44d4e2d5-0850-4b05-9d97-f3916611f340 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:52:10 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Instance destroyed successfully. Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.objects.instance [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lazy-loading 'resources' on Instance uuid 8ae797bd-c587-43a3-b941-e6d6d6c74e51 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1501895670',display_name='tempest-VolumesAdminNegativeTest-server-1501895670',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1501895670',id=7,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzw1O+hOmmy5NgPW3bdNxqbqSrhNejvipkwcp0JQTVAJHNzFgSc6wLIdKA9lC+AU3ZJ2MAGprLUKfW+mBKTjT3fZH2AvICL2uFFTJNA7ynActmX3XPF5TRREc2oNq2DWg==',key_name='tempest-keypair-1275342497',keypairs=,launch_index=0,launched_at=2023-04-21T10:47:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='94e77e1735854e0c966c42e9a613017f',ramdisk_id='',reservation_id='r-gh0werb9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-243340095',owner_user_name='tempest-VolumesAdminNegativeTest-243340095-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:47:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c600e01acfe140cabcdfe54958e66108',uuid=8ae797bd-c587-43a3-b941-e6d6d6c74e51,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converting VIF {"id": "44d4e2d5-0850-4b05-9d97-f3916611f340", "address": "fa:16:3e:d2:a2:e1", "network": {"id": "fcf7861e-296e-4706-871b-557b594e17c3", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-610768075-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "94e77e1735854e0c966c42e9a613017f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap44d4e2d5-08", "ovs_interfaceid": "44d4e2d5-0850-4b05-9d97-f3916611f340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG os_vif [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44d4e2d5-08, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:52:10 user nova-compute[70954]: INFO os_vif [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:a2:e1,bridge_name='br-int',has_traffic_filtering=True,id=44d4e2d5-0850-4b05-9d97-f3916611f340,network=Network(fcf7861e-296e-4706-871b-557b594e17c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44d4e2d5-08') Apr 21 10:52:10 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 
tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Deleting instance files /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51_del Apr 21 10:52:10 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Deletion of /opt/stack/data/nova/instances/8ae797bd-c587-43a3-b941-e6d6d6c74e51_del complete Apr 21 10:52:10 user nova-compute[70954]: INFO nova.compute.manager [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 21 10:52:10 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:52:10 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:52:11 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:52:11 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Took 0.98 seconds to deallocate network for instance. 
Apr 21 10:52:11 user nova-compute[70954]: DEBUG nova.compute.manager [req-f37d5d7d-3a66-4116-b98a-a6ae0dc0f90b req-71d918cf-89d2-4378-853b-1032e4b6aa03 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-vif-deleted-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:11 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:52:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:52:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:11 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Deleted allocations for instance 8ae797bd-c587-43a3-b941-e6d6d6c74e51 Apr 21 10:52:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-35d6b6b8-1e2e-434a-8085-b53ca92f641f tempest-VolumesAdminNegativeTest-243340095 tempest-VolumesAdminNegativeTest-243340095-project-member] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.217s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-d45a66bd-574d-4234-bf87-ef2b24f7224c req-383d65a5-81f0-4351-864c-851917caa893 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received event network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d45a66bd-574d-4234-bf87-ef2b24f7224c req-383d65a5-81f0-4351-864c-851917caa893 service nova] Acquiring lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d45a66bd-574d-4234-bf87-ef2b24f7224c req-383d65a5-81f0-4351-864c-851917caa893 service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d45a66bd-574d-4234-bf87-ef2b24f7224c req-383d65a5-81f0-4351-864c-851917caa893 service nova] Lock "8ae797bd-c587-43a3-b941-e6d6d6c74e51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-d45a66bd-574d-4234-bf87-ef2b24f7224c req-383d65a5-81f0-4351-864c-851917caa893 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] No waiting events found dispatching network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:52:12 user nova-compute[70954]: WARNING nova.compute.manager [req-d45a66bd-574d-4234-bf87-ef2b24f7224c req-383d65a5-81f0-4351-864c-851917caa893 service nova] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Received unexpected event network-vif-plugged-44d4e2d5-0850-4b05-9d97-f3916611f340 for instance with vm_state deleted and task_state None. 
Apr 21 10:52:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:15 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:52:15 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] VM Stopped (Lifecycle Event) Apr 21 10:52:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b878a841-6efc-4135-8cfb-87ef64a00544 None None] [instance: 566040e7-8140-467b-b814-8d7eb62ef735] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:52:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:52:20 user nova-compute[70954]: INFO nova.compute.claims [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Claim successful on node user Apr 21 10:52:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.network.neutron [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:52:20 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.policy [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc80162b39bc4ff2a71a8ff4d34979c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8eaf2efa4ddc4d8fbc5ec14e86d93c53', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:52:20 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Creating image(s) Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "/opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "/opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "/opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.130s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 
tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:20 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk 1073741824" returned: 0 in 0.054s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.188s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Checking if we can resize image /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Cannot resize image /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG nova.objects.instance [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lazy-loading 'migration_context' on Instance uuid 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Ensure instance console log exists: /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:21 user nova-compute[70954]: DEBUG nova.network.neutron [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Successfully created port: d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.neutron [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Successfully updated port: d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock 
"refresh_cache-14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquired lock "refresh_cache-14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.neutron [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.neutron [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-changed-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.compute.manager [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Refreshing instance network info cache due to event network-changed-d8b0279f-95ca-4143-8f78-c6faf74a3620. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] Acquiring lock "refresh_cache-14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.neutron [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Updating instance_info_cache with network_info: [{"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Releasing lock "refresh_cache-14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Instance network_info: |[{"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] Acquired lock "refresh_cache-14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.neutron [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Refreshing network info cache for port d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Start _get_guest_xml network_info=[{"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:52:22 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
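The qemu-img calls earlier in this spawn sequence show the Qcow2 image backend's pattern: probe the cached base image under a prlimit sandbox, then lay a per-instance copy-on-write overlay on top of it. Below is a minimal standalone sketch of those two calls through oslo.concurrency's processutils wrapper; the paths, the 1073741824-byte limits/size and the LC_ALL/LANG settings are copied from the log entries above, and the snippet is illustrative rather than Nova's own code.

    # Illustrative re-run of the two qemu-img calls logged above; not Nova code.
    import json

    from oslo_concurrency import processutils

    BASE = '/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a'
    DISK = '/opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk'

    # Matches the logged "--as=1073741824 --cpu=30" prlimit wrapper.
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    # Probe the raw base image (shared access, JSON output).
    out, _err = processutils.execute(
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
        prlimit=limits, env_variables={'LC_ALL': 'C', 'LANG': 'C'})
    print(json.loads(out)['virtual-size'])

    # Create the 1 GiB per-instance qcow2 overlay backed by the raw base image.
    processutils.execute(
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
        DISK, '1073741824',
        env_variables={'LC_ALL': 'C', 'LANG': 'C'})

The "Cannot resize image ... to a smaller size" entry that follows in the log is the guard in nova.virt.disk.api.can_resize_image comparing the requested size against the overlay's current virtual size before any resize is attempted.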
Apr 21 10:52:22 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 
tempest-VolumesActionsTest-301211957-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:52:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1210829464',display_name='tempest-VolumesActionsTest-instance-1210829464',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1210829464',id=14,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8eaf2efa4ddc4d8fbc5ec14e86d93c53',ramdisk_id='',reservation_id='r-z7mh3xce',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-301211957',owner_user_name='tempest-VolumesActionsTest-301211957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:52:21Z,user_data=None,user_id='dc80162b39bc4ff2a71a8ff4d34979c3',uuid=14bd5401-4cc1-4827-8d4a-fd1358bb9c6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Converting VIF {"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.objects.instance [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lazy-loading 'pci_devices' on Instance uuid 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] End _get_guest_xml xml= Apr 21 10:52:22 user nova-compute[70954]: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b Apr 21 10:52:22 user nova-compute[70954]: instance-0000000e Apr 21 10:52:22 user nova-compute[70954]: 131072 Apr 21 10:52:22 user nova-compute[70954]: 1 Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: tempest-VolumesActionsTest-instance-1210829464 Apr 21 10:52:22 user nova-compute[70954]: 2023-04-21 10:52:22 Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: 128 Apr 21 10:52:22 user nova-compute[70954]: 1 Apr 21 10:52:22 user nova-compute[70954]: 0 Apr 21 10:52:22 user nova-compute[70954]: 0 Apr 21 10:52:22 user nova-compute[70954]: 1 Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: tempest-VolumesActionsTest-301211957-project-member Apr 21 10:52:22 user nova-compute[70954]: tempest-VolumesActionsTest-301211957 Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 
21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: OpenStack Foundation Apr 21 10:52:22 user nova-compute[70954]: OpenStack Nova Apr 21 10:52:22 user nova-compute[70954]: 0.0.0 Apr 21 10:52:22 user nova-compute[70954]: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b Apr 21 10:52:22 user nova-compute[70954]: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b Apr 21 10:52:22 user nova-compute[70954]: Virtual Machine Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: hvm Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Nehalem Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: /dev/urandom Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: Apr 21 10:52:22 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:52:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1210829464',display_name='tempest-VolumesActionsTest-instance-1210829464',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1210829464',id=14,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8eaf2efa4ddc4d8fbc5ec14e86d93c53',ramdisk_id='',reservation_id='r-z7mh3xce',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-301211957',owner_user_name='tempest-VolumesActionsTest-301211957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:52:21Z,user_data=None,user_id='dc80162b39bc4ff2a71a8ff4d34979c3',uuid=14bd5401-4cc1-4827-8d4a-fd1358bb9c6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Converting VIF {"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG os_vif [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8b0279f-95, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd8b0279f-95, col_values=(('external_ids', {'iface-id': 
'd8b0279f-95ca-4143-8f78-c6faf74a3620', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:4e:67', 'vm-uuid': '14bd5401-4cc1-4827-8d4a-fd1358bb9c6b'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:22 user nova-compute[70954]: INFO os_vif [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] No VIF found with MAC fa:16:3e:63:4e:67, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:52:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:23 user nova-compute[70954]: DEBUG nova.network.neutron [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Updated VIF entry in instance network info cache for port d8b0279f-95ca-4143-8f78-c6faf74a3620. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:52:23 user nova-compute[70954]: DEBUG nova.network.neutron [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Updating instance_info_cache with network_info: [{"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:52:23 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-628a50aa-dbec-446e-a6cb-3177e51c3923 req-648cf59f-950b-4b95-8e00-624eff59a9aa service nova] Releasing lock "refresh_cache-14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG nova.compute.manager [req-fcaea8c4-3d8d-4b0c-a02d-c145127175b1 req-586ef544-9d4a-4317-9ce6-5e6365efe1e4 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fcaea8c4-3d8d-4b0c-a02d-c145127175b1 req-586ef544-9d4a-4317-9ce6-5e6365efe1e4 service nova] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fcaea8c4-3d8d-4b0c-a02d-c145127175b1 req-586ef544-9d4a-4317-9ce6-5e6365efe1e4 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [req-fcaea8c4-3d8d-4b0c-a02d-c145127175b1 req-586ef544-9d4a-4317-9ce6-5e6365efe1e4 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:24 user nova-compute[70954]: DEBUG nova.compute.manager [req-fcaea8c4-3d8d-4b0c-a02d-c145127175b1 req-586ef544-9d4a-4317-9ce6-5e6365efe1e4 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] No waiting events found dispatching network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:52:24 user nova-compute[70954]: WARNING nova.compute.manager [req-fcaea8c4-3d8d-4b0c-a02d-c145127175b1 req-586ef544-9d4a-4317-9ce6-5e6365efe1e4 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received unexpected event network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 for instance with vm_state building and task_state spawning. Apr 21 10:52:25 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:52:25 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] VM Stopped (Lifecycle Event) Apr 21 10:52:25 user nova-compute[70954]: DEBUG nova.compute.manager [None req-c92ec9d4-ba7f-455a-931e-7e7ff8d175e6 None None] [instance: 8ae797bd-c587-43a3-b941-e6d6d6c74e51] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:52:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] VM Resumed (Lifecycle Event) Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:52:26 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Instance spawned successfully. 
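The ovsdbapp transaction entries above (AddBridgeCommand, AddPortCommand, DbSetCommand) are os-vif's ovs plugin wiring tapd8b0279f-95 into br-int and stamping the Interface row with the Neutron port id, MAC and instance uuid. A rough sketch of the same three commands issued through ovsdbapp's native OVSDB interface follows; the socket path and timeout are assumptions, while the bridge, port and column values are the ones from the log.

    # Hedged sketch: replay the three OVSDB commands seen in the log via ovsdbapp.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed socket path
    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tapd8b0279f-95', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tapd8b0279f-95',
            ('external_ids', {'iface-id': 'd8b0279f-95ca-4143-8f78-c6faf74a3620',
                              'iface-status': 'active',
                              'attached-mac': 'fa:16:3e:63:4e:67',
                              'vm-uuid': '14bd5401-4cc1-4827-8d4a-fd1358bb9c6b'})))

As in the log, re-adding an already existing bridge with may_exist=True is a no-op, which is why the first transaction reports "Transaction caused no change".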
Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Found default for hw_vif_model of virtio {{(pid=70954) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:52:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:52:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] VM Started (Lifecycle Event) Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:52:26 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:52:26 user nova-compute[70954]: INFO nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Took 5.41 seconds to spawn the instance on the hypervisor. Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:52:26 user nova-compute[70954]: INFO nova.compute.manager [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Took 6.04 seconds to build instance. 
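The "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1" entries above use Nova's integer power-state encoding, and the sync is skipped while task_state is still spawning. For orientation, a tiny check of the two values that appear in this log; it assumes the Nova tree is importable, as it is on this devstack host.

    # Orientation only: the integers logged above are nova.compute.power_state
    # constants; 0 and 1 are the two values that appear in this log.
    from nova.compute import power_state

    assert power_state.NOSTATE == 0   # "current DB power_state: 0" (no guest yet)
    assert power_state.RUNNING == 1   # "VM power_state: 1" (libvirt reports the guest up)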
Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [req-b53f7d7f-eeff-440c-b142-d9495d99d5b8 req-a546f4d0-203a-4e9f-8d6c-c2c4309fb8b7 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b53f7d7f-eeff-440c-b142-d9495d99d5b8 req-a546f4d0-203a-4e9f-8d6c-c2c4309fb8b7 service nova] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b53f7d7f-eeff-440c-b142-d9495d99d5b8 req-a546f4d0-203a-4e9f-8d6c-c2c4309fb8b7 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b53f7d7f-eeff-440c-b142-d9495d99d5b8 req-a546f4d0-203a-4e9f-8d6c-c2c4309fb8b7 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:26 user nova-compute[70954]: DEBUG nova.compute.manager [req-b53f7d7f-eeff-440c-b142-d9495d99d5b8 req-a546f4d0-203a-4e9f-8d6c-c2c4309fb8b7 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] No waiting events found dispatching network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:52:26 user nova-compute[70954]: WARNING nova.compute.manager [req-b53f7d7f-eeff-440c-b142-d9495d99d5b8 req-a546f4d0-203a-4e9f-8d6c-c2c4309fb8b7 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received unexpected event network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 for instance with vm_state active and task_state None. 
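The per-instance "-events" lock lines above follow the usual oslo.concurrency pattern: the "acquired ... waited" and "released ... held" timings are emitted by the lockutils wrapper around the locked callable. A minimal sketch of that pattern, assuming an in-process (non-external) lock and reusing the lock name from this log:

    # Minimal sketch of the lock usage behind the "-events" entries above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events')
    def _pop_event_example():
        # Runs with the per-instance event lock held; the decorator's wrapper
        # logs the acquire/wait and release/held durations at DEBUG level.
        return None

    _pop_event_example()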
Apr 21 10:52:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-642e0c11-4300-46d5-a9ef-cef9a8da52dc tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.144s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:52:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:52:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:52:44 user nova-compute[70954]: INFO nova.compute.manager [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] instance snapshotting Apr 21 10:52:44 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Beginning live snapshot process Apr 21 10:52:44 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json -f qcow2 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:45 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 
tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json -f qcow2" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:45 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json -f qcow2 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:45 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json -f qcow2" returned: 0 in 0.144s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:45 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:45 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:45 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7.delta 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:45 user nova-compute[70954]: 
DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7.delta 1073741824" returned: 0 in 0.054s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:45 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Quiescing instance not available: QEMU guest agent is not enabled. Apr 21 10:52:46 user nova-compute[70954]: DEBUG nova.virt.libvirt.guest [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=70954) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 10:52:46 user nova-compute[70954]: DEBUG nova.virt.libvirt.guest [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=70954) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 10:52:46 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Skipping quiescing instance: QEMU guest agent is not enabled. 
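Note: each "qemu-img info" probe above is run through oslo.concurrency's prlimit wrapper (--as=1073741824 --cpu=30) so a runaway qemu-img cannot exhaust address space or CPU time. A sketch of an equivalent call through the processutils API; ProcessLimits and execute are real oslo.concurrency interfaces, while the disk path is the one recorded in the log and only exists on that host.

    # Sketch: a prlimit-guarded "qemu-img info" probe like the ones logged above.
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1073741824,  # --as
                                        cpu_time=30)               # --cpu

    out, err = processutils.execute(
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info",
        "/opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk",
        "--force-share", "--output=json", "-f", "qcow2",
        prlimit=limits)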
Apr 21 10:52:46 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:52:46 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7.delta /opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:52:47 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7.delta /opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7" returned: 0 in 0.654s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:52:47 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Snapshot extracted, beginning image upload Apr 21 10:52:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:49 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Snapshot image upload complete Apr 21 10:52:49 user nova-compute[70954]: INFO nova.compute.manager [None req-7731f480-1221-4163-8e53-756acadf6c46 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Took 4.77 seconds to snapshot the instance on the hypervisor. 
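Note: once the COPY block job has finished, the live-snapshot sequence recorded above reduces to two qemu-img invocations: create a qcow2 delta on top of the cached base image, then flatten it into a standalone qcow2 for upload. A sketch with the exact arguments from the log; the tmpnj6pwosf paths are the throwaway ones from that particular run.

    # Sketch of the two qemu-img steps logged above; arguments copied from the log.
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a"
    DELTA = "/opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7.delta"
    SNAP = "/opt/stack/data/nova/instances/snapshots/tmpnj6pwosf/efaebc648bf247c0b3ca864db7955ff7"

    # 1. Thin delta over the raw base image (1 GiB virtual size, as in the log).
    subprocess.run(["qemu-img", "create", "-f", "qcow2",
                    "-o", f"backing_file={BASE},backing_fmt=raw",
                    DELTA, "1073741824"], check=True)

    # 2. Flatten the delta into a self-contained qcow2 that can be uploaded.
    subprocess.run(["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
                    DELTA, SNAP], check=True)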
Apr 21 10:52:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:52:59 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:00 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances with incomplete migration {{(pid=70954) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 10:53:01 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:53:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3" 
acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:02 user nova-compute[70954]: INFO nova.compute.manager [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Terminating instance Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 84b55fc0-e748-4c05-97ad-a6994c0487d2 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-cd6678b7-e402-4d77-a5ef-b0f5fa2d937c req-e1c95be9-ebc5-476c-b05c-fd21d7a29193 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-vif-unplugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-cd6678b7-e402-4d77-a5ef-b0f5fa2d937c req-e1c95be9-ebc5-476c-b05c-fd21d7a29193 service nova] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-cd6678b7-e402-4d77-a5ef-b0f5fa2d937c req-e1c95be9-ebc5-476c-b05c-fd21d7a29193 service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-cd6678b7-e402-4d77-a5ef-b0f5fa2d937c req-e1c95be9-ebc5-476c-b05c-fd21d7a29193 service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-cd6678b7-e402-4d77-a5ef-b0f5fa2d937c req-e1c95be9-ebc5-476c-b05c-fd21d7a29193 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] No waiting events found dispatching network-vif-unplugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-cd6678b7-e402-4d77-a5ef-b0f5fa2d937c req-e1c95be9-ebc5-476c-b05c-fd21d7a29193 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-vif-unplugged-516728bc-fbfa-4318-bbac-6c94509ee008 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Instance destroyed successfully. Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lazy-loading 'resources' on Instance uuid 69031436-19d1-4cc1-91e7-4d99381b6ae3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2059401768',display_name='tempest-TestMinimumBasicScenario-server-2059401768',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-2059401768',id=13,image_ref='86c32514-9140-48a4-8ce6-baeafcca9587',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdIcpjsQgofTWxPhSCIN/mpUwMbeXhtJNFymtTMxwmV7Bxki7QmQko9CAbLh0f7+BwLKjoTQlfvvfT0xvv8wlFZ5IIip5N+0UUo9CL93KfHE8COz3zk4cngjZQ1yxLNwA==',key_name='tempest-TestMinimumBasicScenario-990689924',keypairs=,launch_index=0,launched_at=2023-04-21T10:51:15Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9ead44a7da0640cbb2cf8dece0ea4f40',ramdisk_id='',reservation_id='r-i4ys6mg0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86c32514-9140-48a4-8ce6-baeafcca9587',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-339882477',owner_user_name='tempest-TestMinimumBasicScenario-339882477-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:51:16Z,user_data=None,user_id='6cae5a1734d24ac8aebc233dd31d3084',uuid=69031436-19d1-4cc1-91e7-4d99381b6ae3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converting VIF {"id": "516728bc-fbfa-4318-bbac-6c94509ee008", "address": "fa:16:3e:b8:cb:12", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap516728bc-fb", "ovs_interfaceid": "516728bc-fbfa-4318-bbac-6c94509ee008", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG os_vif [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap516728bc-fb, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:02 user nova-compute[70954]: INFO os_vif [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:cb:12,bridge_name='br-int',has_traffic_filtering=True,id=516728bc-fbfa-4318-bbac-6c94509ee008,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap516728bc-fb') Apr 21 10:53:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Deleting instance files /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3_del Apr 21 10:53:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Deletion of /opt/stack/data/nova/instances/69031436-19d1-4cc1-91e7-4d99381b6ae3_del complete Apr 21 10:53:02 user nova-compute[70954]: 
INFO nova.compute.manager [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Took 0.91 seconds to destroy the instance on the hypervisor. Apr 21 10:53:02 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:53:02 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updating instance_info_cache with network_info: [{"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-84b55fc0-e748-4c05-97ad-a6994c0487d2" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] 
Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.compute.manager [req-e7219855-2970-4727-8abf-f9b7c2d1e9c1 req-31315684-e288-46ab-a413-84220721ead9 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-vif-deleted-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:03 user nova-compute[70954]: INFO nova.compute.manager [req-e7219855-2970-4727-8abf-f9b7c2d1e9c1 req-31315684-e288-46ab-a413-84220721ead9 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Neutron deleted interface 516728bc-fbfa-4318-bbac-6c94509ee008; detaching it from the instance and deleting it from the info cache Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.network.neutron [req-e7219855-2970-4727-8abf-f9b7c2d1e9c1 req-31315684-e288-46ab-a413-84220721ead9 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:03 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Took 0.76 seconds to deallocate network for instance. Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.compute.manager [req-e7219855-2970-4727-8abf-f9b7c2d1e9c1 req-31315684-e288-46ab-a413-84220721ead9 service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Detach interface failed, port_id=516728bc-fbfa-4318-bbac-6c94509ee008, reason: Instance 69031436-19d1-4cc1-91e7-4d99381b6ae3 could not be found. {{(pid=70954) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:53:03 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.223s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.145s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:53:04 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Deleted allocations for instance 69031436-19d1-4cc1-91e7-4d99381b6ae3 Apr 21 10:53:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ef4b6300-035d-42a2-8468-d856f091f737 tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3" "released" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.082s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG nova.compute.manager [req-3b03bf1f-d441-499f-9a33-b72a9cf2f3f7 req-b10c73aa-98c6-42e2-abb9-34cf7560f6fb service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received event network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3b03bf1f-d441-499f-9a33-b72a9cf2f3f7 req-b10c73aa-98c6-42e2-abb9-34cf7560f6fb service nova] Acquiring lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3b03bf1f-d441-499f-9a33-b72a9cf2f3f7 req-b10c73aa-98c6-42e2-abb9-34cf7560f6fb service nova] Lock 
"69031436-19d1-4cc1-91e7-4d99381b6ae3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3b03bf1f-d441-499f-9a33-b72a9cf2f3f7 req-b10c73aa-98c6-42e2-abb9-34cf7560f6fb service nova] Lock "69031436-19d1-4cc1-91e7-4d99381b6ae3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG nova.compute.manager [req-3b03bf1f-d441-499f-9a33-b72a9cf2f3f7 req-b10c73aa-98c6-42e2-abb9-34cf7560f6fb service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] No waiting events found dispatching network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:53:04 user nova-compute[70954]: WARNING nova.compute.manager [req-3b03bf1f-d441-499f-9a33-b72a9cf2f3f7 req-b10c73aa-98c6-42e2-abb9-34cf7560f6fb service nova] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Received unexpected event network-vif-plugged-516728bc-fbfa-4318-bbac-6c94509ee008 for instance with vm_state deleted and task_state None. Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 
10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:53:05 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:53:05 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8668MB free_disk=26.45824432373047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": 
"0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f4dda568-8f3b-40eb-aff3-64d3e759c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance d28e1e38-3ed5-468e-b672-8b94a909820c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:53:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.227s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 
tempest-AttachVolumeNegativeTest-159654333-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:53:06 user nova-compute[70954]: INFO nova.compute.claims [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Claim successful on node user Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] There are 0 instances to clean {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:53:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.policy [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7fc66871488428e9842404d885bcfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '14bc6b0c20204c8287b3523814007856', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:53:06 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Creating image(s) Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "/opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "/opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "/opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.155s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk 1073741824" returned: 0 in 0.047s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.211s 
{{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.161s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk --force-share --output=json" returned: 0 in 0.186s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Cannot resize image /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.objects.instance [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'migration_context' on Instance uuid d28e1e38-3ed5-468e-b672-8b94a909820c {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Successfully created port: 92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Ensure instance console log exists: /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 
0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Successfully updated port: 92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquired lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:07 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-changed-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Refreshing instance network info cache due to event network-changed-92fab70c-ebc3-4119-8fa9-874304b51cb5. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] Acquiring lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.neutron [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Updating instance_info_cache with network_info: [{"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Releasing lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Instance network_info: |[{"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] Acquired lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.neutron [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Refreshing network info cache for port 92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Start _get_guest_xml network_info=[{"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:53:08 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:53:08 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None 
req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2104835370',display_name='tempest-AttachVolumeNegativeTest-server-2104835370',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2104835370',id=15,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFgvsOkIKKUlR/vc13NXCoQlDdgzN+g6wKzZajODT/6/BJRfaLwQgnuA3mDp4OA0MDn0gizzj2Pl2nG82WPwPorkuBJwINQHac9OchhGaDq4Fh1FeHhPPumdwSQw3oc8bA==',key_name='tempest-keypair-962586907',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-j20flv9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:53:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=d28e1e38-3ed5-468e-b672-8b94a909820c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.objects.instance [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'pci_devices' on Instance uuid d28e1e38-3ed5-468e-b672-8b94a909820c {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] End _get_guest_xml xml= Apr 21 10:53:08 user nova-compute[70954]: d28e1e38-3ed5-468e-b672-8b94a909820c Apr 21 10:53:08 user nova-compute[70954]: instance-0000000f Apr 21 10:53:08 user nova-compute[70954]: 131072 Apr 21 10:53:08 user nova-compute[70954]: 1 Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-server-2104835370 Apr 21 10:53:08 user nova-compute[70954]: 2023-04-21 10:53:08 Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: 128 Apr 21 10:53:08 user nova-compute[70954]: 1 Apr 21 10:53:08 user nova-compute[70954]: 0 Apr 21 10:53:08 user nova-compute[70954]: 0 Apr 21 10:53:08 user nova-compute[70954]: 1 Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-159654333-project-member Apr 21 10:53:08 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-159654333 Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: OpenStack Foundation Apr 21 10:53:08 user nova-compute[70954]: OpenStack Nova Apr 21 10:53:08 user nova-compute[70954]: 0.0.0 Apr 21 10:53:08 user 
nova-compute[70954]: d28e1e38-3ed5-468e-b672-8b94a909820c Apr 21 10:53:08 user nova-compute[70954]: d28e1e38-3ed5-468e-b672-8b94a909820c Apr 21 10:53:08 user nova-compute[70954]: Virtual Machine Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: hvm Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Nehalem Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: /dev/urandom Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: Apr 21 10:53:08 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2104835370',display_name='tempest-AttachVolumeNegativeTest-server-2104835370',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2104835370',id=15,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFgvsOkIKKUlR/vc13NXCoQlDdgzN+g6wKzZajODT/6/BJRfaLwQgnuA3mDp4OA0MDn0gizzj2Pl2nG82WPwPorkuBJwINQHac9OchhGaDq4Fh1FeHhPPumdwSQw3oc8bA==',key_name='tempest-keypair-962586907',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-j20flv9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:53:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=d28e1e38-3ed5-468e-b672-8b94a909820c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG os_vif [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92fab70c-eb, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap92fab70c-eb, col_values=(('external_ids', {'iface-id': '92fab70c-ebc3-4119-8fa9-874304b51cb5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:04:3b', 'vm-uuid': 'd28e1e38-3ed5-468e-b672-8b94a909820c'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:08 user nova-compute[70954]: INFO os_vif [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] No VIF found with MAC fa:16:3e:13:04:3b, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.neutron [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Updated VIF entry in instance network info cache for port 92fab70c-ebc3-4119-8fa9-874304b51cb5. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG nova.network.neutron [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Updating instance_info_cache with network_info: [{"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-24c434ed-7de8-4572-817d-553ed6c28218 req-606e6070-45bf-4acf-a988-e4ed8384e288 service nova] Releasing lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:53:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-f23726ef-2ca6-4c1d-90d2-c58f9cb9fb21 req-abea490d-b9e9-4323-9385-99ca4d39c3dc service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f23726ef-2ca6-4c1d-90d2-c58f9cb9fb21 req-abea490d-b9e9-4323-9385-99ca4d39c3dc service nova] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f23726ef-2ca6-4c1d-90d2-c58f9cb9fb21 req-abea490d-b9e9-4323-9385-99ca4d39c3dc service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:10 user 
nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f23726ef-2ca6-4c1d-90d2-c58f9cb9fb21 req-abea490d-b9e9-4323-9385-99ca4d39c3dc service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG nova.compute.manager [req-f23726ef-2ca6-4c1d-90d2-c58f9cb9fb21 req-abea490d-b9e9-4323-9385-99ca4d39c3dc service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] No waiting events found dispatching network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:53:10 user nova-compute[70954]: WARNING nova.compute.manager [req-f23726ef-2ca6-4c1d-90d2-c58f9cb9fb21 req-abea490d-b9e9-4323-9385-99ca4d39c3dc service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received unexpected event network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 for instance with vm_state building and task_state spawning. Apr 21 10:53:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:53:12 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] VM Resumed (Lifecycle Event) Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:53:12 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Instance spawned successfully. Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG 
nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:53:12 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] During sync_power_state the instance has a pending task (spawning). Skip. 
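The lifecycle entries above show the power-state sync decision: the database still records power_state 0 (no state) while the hypervisor reports 1 (running), and the sync is skipped because the instance has a pending task (spawning). Below is a minimal sketch of that decision using the numeric values exactly as the log prints them; the helper function is illustrative and is not Nova's implementation.

    # Sketch of the sync_power_state skip logic seen above (illustrative, not Nova code).
    NOSTATE = 0   # "current DB power_state: 0"
    RUNNING = 1   # "VM power_state: 1"

    def sync_power_state(vm_state, task_state, db_power_state, vm_power_state):
        """Return the power state to record, or None when the sync is skipped."""
        if task_state is not None:
            # "During sync_power_state the instance has a pending task (spawning). Skip."
            print(f"pending task ({task_state}), skipping sync")
            return None
        if db_power_state != vm_power_state:
            # Only with no task in flight would the database be updated to the hypervisor's view.
            print(f"updating power state {db_power_state} -> {vm_power_state}")
            return vm_power_state
        return db_power_state

    # Values taken from the "Resumed" lifecycle event above: the sync is skipped.
    sync_power_state('building', 'spawning', NOSTATE, RUNNING)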
Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:53:12 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] VM Started (Lifecycle Event) Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-3710d2cb-ec77-4c2b-9e7a-d656683eb3aa req-e8454d5f-69ed-43e6-bde8-493507e654e8 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3710d2cb-ec77-4c2b-9e7a-d656683eb3aa req-e8454d5f-69ed-43e6-bde8-493507e654e8 service nova] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3710d2cb-ec77-4c2b-9e7a-d656683eb3aa req-e8454d5f-69ed-43e6-bde8-493507e654e8 service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3710d2cb-ec77-4c2b-9e7a-d656683eb3aa req-e8454d5f-69ed-43e6-bde8-493507e654e8 service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-3710d2cb-ec77-4c2b-9e7a-d656683eb3aa req-e8454d5f-69ed-43e6-bde8-493507e654e8 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] No waiting events found dispatching network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:53:12 user nova-compute[70954]: WARNING nova.compute.manager [req-3710d2cb-ec77-4c2b-9e7a-d656683eb3aa req-e8454d5f-69ed-43e6-bde8-493507e654e8 service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received unexpected event network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 for instance with vm_state building and task_state spawning. 
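The duplicate network-vif-plugged handling above follows a register/wait/pop pattern: the manager takes the per-instance events lock, pops any registered waiter for the named event, and logs the event as unexpected when no waiter remains (the earlier waiter already completed during the build). A small sketch of that pattern with a plain threading lock and waiter map follows; it stands in for, and is not, Nova's InstanceEvents class.

    # Illustrative per-instance event latch: pop under a lock; nothing waiting => "unexpected".
    import threading

    class EventLatch:
        def __init__(self):
            self._lock = threading.Lock()   # stands in for the per-instance "-events" lock
            self._waiters = {}              # {(instance_uuid, event_name): threading.Event}

        def prepare(self, instance_uuid, event_name):
            with self._lock:
                ev = threading.Event()
                self._waiters[(instance_uuid, event_name)] = ev
                return ev

        def pop_and_signal(self, instance_uuid, event_name):
            with self._lock:
                ev = self._waiters.pop((instance_uuid, event_name), None)
            if ev is None:
                print(f"Received unexpected event {event_name} for instance {instance_uuid}")
                return False
            ev.set()
            return True

    latch = EventLatch()
    uuid = "d28e1e38-3ed5-468e-b672-8b94a909820c"
    event = "network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5"
    latch.prepare(uuid, event)
    latch.pop_and_signal(uuid, event)   # first event finds the waiter and signals it
    latch.pop_and_signal(uuid, event)   # second event finds nothing waiting -> "unexpected"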
Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:53:12 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:53:12 user nova-compute[70954]: INFO nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Took 5.62 seconds to spawn the instance on the hypervisor. Apr 21 10:53:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:53:12 user nova-compute[70954]: INFO nova.compute.manager [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Took 6.46 seconds to build instance. Apr 21 10:53:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1481f221-045e-4ef1-b280-abc7e77cab95 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.911s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:17 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:53:17 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] VM Stopped (Lifecycle Event) Apr 21 10:53:17 user nova-compute[70954]: DEBUG nova.compute.manager [None req-0310aa1b-b80c-4688-98f6-269e6b1443a8 None None] [instance: 69031436-19d1-4cc1-91e7-4d99381b6ae3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:53:17 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:53:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "a5418b3c-8587-4544-b562-fe01a69be3fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "a5418b3c-8587-4544-b562-fe01a69be3fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:51 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:53:52 user nova-compute[70954]: INFO nova.compute.claims [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Claim successful on node user Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:53:52 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:53:52 user nova-compute[70954]: INFO nova.virt.block_device [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Booting with volume-backed-image 3b29a01a-1fc0-4d0d-89fb-23d22b2de02e at /dev/vda Apr 21 10:53:52 user nova-compute[70954]: DEBUG nova.policy [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54c67d90b6014d9ea24ef2552006bc04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aad84a0e014f47ddaeaddc88bf16b0a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:53:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Successfully created port: e67a2be9-b6b6-468f-867d-0e4930f5d56d {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:53:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 
a5418b3c-8587-4544-b562-fe01a69be3fc] Successfully updated port: e67a2be9-b6b6-468f-867d-0e4930f5d56d {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:53:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "refresh_cache-a5418b3c-8587-4544-b562-fe01a69be3fc" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquired lock "refresh_cache-a5418b3c-8587-4544-b562-fe01a69be3fc" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.compute.manager [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Received event network-changed-e67a2be9-b6b6-468f-867d-0e4930f5d56d {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.compute.manager [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Refreshing instance network info cache due to event network-changed-e67a2be9-b6b6-468f-867d-0e4930f5d56d. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] Acquiring lock "refresh_cache-a5418b3c-8587-4544-b562-fe01a69be3fc" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Updating instance_info_cache with network_info: [{"id": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "address": "fa:16:3e:c0:c4:0d", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape67a2be9-b6", "ovs_interfaceid": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Releasing lock "refresh_cache-a5418b3c-8587-4544-b562-fe01a69be3fc" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Instance network_info: |[{"id": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "address": "fa:16:3e:c0:c4:0d", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape67a2be9-b6", "ovs_interfaceid": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] Acquired lock 
"refresh_cache-a5418b3c-8587-4544-b562-fe01a69be3fc" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.network.neutron [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Refreshing network info cache for port e67a2be9-b6b6-468f-867d-0e4930f5d56d {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.network.neutron [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Updated VIF entry in instance network info cache for port e67a2be9-b6b6-468f-867d-0e4930f5d56d. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG nova.network.neutron [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Updating instance_info_cache with network_info: [{"id": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "address": "fa:16:3e:c0:c4:0d", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape67a2be9-b6", "ovs_interfaceid": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e5da14c6-800b-4043-86cb-15f9a082e625 req-39a8052b-e02a-414a-883c-2b41df5f9f35 service nova] Releasing lock "refresh_cache-a5418b3c-8587-4544-b562-fe01a69be3fc" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:53:56 user nova-compute[70954]: INFO nova.compute.claims [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Claim successful on node user Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.network.neutron [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:53:56 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.policy [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6cae5a1734d24ac8aebc233dd31d3084', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ead44a7da0640cbb2cf8dece0ea4f40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:53:56 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Creating image(s) Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "/opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "/opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "/opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "259e6260deb1d46b0984517d2a7bdb457842dba2" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "259e6260deb1d46b0984517d2a7bdb457842dba2" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.part --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.part --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG nova.virt.images [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] d5ab5add-8fb7-4436-88e9-0cb945ddc863 was qcow2, converting to raw {{(pid=70954) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG nova.privsep.utils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70954) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.part /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.converted {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.part /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.converted" returned: 0 in 0.267s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.converted --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG nova.network.neutron [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Successfully created port: ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2.converted --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "259e6260deb1d46b0984517d2a7bdb457842dba2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.972s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2 --force-share --output=json" returned: 0 in 0.128s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "259e6260deb1d46b0984517d2a7bdb457842dba2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "259e6260deb1d46b0984517d2a7bdb457842dba2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info
/opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2 --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2,backing_fmt=raw /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2,backing_fmt=raw /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk 1073741824" returned: 0 in 0.050s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "259e6260deb1d46b0984517d2a7bdb457842dba2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:57 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2 --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:58 user nova-compute[70954]: WARNING nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Volume id: f5babf8e-2b57-4f29-a2fd-4eca76e3c72f finished being created but its status is error. 
Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume f5babf8e-2b57-4f29-a2fd-4eca76e3c72f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Traceback (most recent call last): Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] driver_block_device.attach_block_devices( Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] _log_and_attach(device) Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] bdm.attach(*attach_args, **attach_kwargs) Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] self.volume_id, self.attachment_id = self._create_volume( Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] with excutils.save_and_reraise_exception(): Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] self.force_reraise() Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] raise self.value Apr 21 10:53:58 user 
nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] wait_func(context, volume_id) Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] nova.exception.VolumeNotCreated: Volume f5babf8e-2b57-4f29-a2fd-4eca76e3c72f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.claims [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Aborting claim: {{(pid=70954) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/259e6260deb1d46b0984517d2a7bdb457842dba2 --force-share --output=json" returned: 0 in 0.148s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_power_states {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Cannot resize image /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.objects.instance [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lazy-loading 'migration_context' on Instance uuid f3e0bc01-1cf2-4ff9-bec6-12a37e44171c {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Ensure instance console log exists: /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: WARNING nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] While synchronizing instance power states, found 6 instances in the database and 5 instances on the hypervisor. 
Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid 84b55fc0-e748-4c05-97ad-a6994c0487d2 {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid f8609da3-c26d-482a-bc03-017baf4bce22 {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid f4dda568-8f3b-40eb-aff3-64d3e759c310 {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid d28e1e38-3ed5-468e-b672-8b94a909820c {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid f3e0bc01-1cf2-4ff9-bec6-12a37e44171c {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "f8609da3-c26d-482a-bc03-017baf4bce22" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.064s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: 
held 0.065s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "f8609da3-c26d-482a-bc03-017baf4bce22" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.068s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.068s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.085s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.388s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Build of instance a5418b3c-8587-4544-b562-fe01a69be3fc aborted: Volume f5babf8e-2b57-4f29-a2fd-4eca76e3c72f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.utils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Build of instance a5418b3c-8587-4544-b562-fe01a69be3fc aborted: Volume f5babf8e-2b57-4f29-a2fd-4eca76e3c72f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=70954) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 21 10:53:58 user nova-compute[70954]: ERROR nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Build of instance a5418b3c-8587-4544-b562-fe01a69be3fc aborted: Volume f5babf8e-2b57-4f29-a2fd-4eca76e3c72f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance a5418b3c-8587-4544-b562-fe01a69be3fc aborted: Volume f5babf8e-2b57-4f29-a2fd-4eca76e3c72f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Unplugging VIFs for instance {{(pid=70954) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1257500458',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1257500458',id=16,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-4rvxgcfz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:53:52Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=a5418b3c-8587-4544-b562-fe01a69be3fc,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "address": "fa:16:3e:c0:c4:0d", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape67a2be9-b6", "ovs_interfaceid": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "address": "fa:16:3e:c0:c4:0d", "network": {"id": 
"cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape67a2be9-b6", "ovs_interfaceid": "e67a2be9-b6b6-468f-867d-0e4930f5d56d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:c4:0d,bridge_name='br-int',has_traffic_filtering=True,id=e67a2be9-b6b6-468f-867d-0e4930f5d56d,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape67a2be9-b6') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG os_vif [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:c4:0d,bridge_name='br-int',has_traffic_filtering=True,id=e67a2be9-b6b6-468f-867d-0e4930f5d56d,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape67a2be9-b6') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape67a2be9-b6, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:53:58 user nova-compute[70954]: INFO os_vif [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:c0:c4:0d,bridge_name='br-int',has_traffic_filtering=True,id=e67a2be9-b6b6-468f-867d-0e4930f5d56d,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape67a2be9-b6') Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Unplugged VIFs for instance {{(pid=70954) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 129-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.network.neutron [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Successfully updated port: ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "refresh_cache-f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquired lock "refresh_cache-f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.network.neutron [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-changed-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Refreshing instance network info cache due to event network-changed-ec8edd23-eb04-4e01-874f-7a5ad305eacc. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] Acquiring lock "refresh_cache-f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:53:58 user nova-compute[70954]: DEBUG nova.network.neutron [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Updating instance_info_cache with network_info: [{"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Releasing lock "refresh_cache-f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Instance network_info: |[{"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] Acquired lock "refresh_cache-f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.neutron [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Refreshing network info cache for port ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Start _get_guest_xml network_info=[{"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:53:53Z,direct_url=,disk_format='qcow2',id=d5ab5add-8fb7-4436-88e9-0cb945ddc863,min_disk=0,min_ram=0,name='tempest-scenario-img--1437401460',owner='9ead44a7da0640cbb2cf8dece0ea4f40',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:53:55Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': 'd5ab5add-8fb7-4436-88e9-0cb945ddc863'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:53:59 user nova-compute[70954]: INFO nova.compute.manager [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] 
[instance: a5418b3c-8587-4544-b562-fe01a69be3fc] Took 0.96 seconds to deallocate network for instance. Apr 21 10:53:59 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:53:59 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:53:53Z,direct_url=,disk_format='qcow2',id=d5ab5add-8fb7-4436-88e9-0cb945ddc863,min_disk=0,min_ram=0,name='tempest-scenario-img--1437401460',owner='9ead44a7da0640cbb2cf8dece0ea4f40',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:53:55Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:53:59 
user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-843445589',display_name='tempest-TestMinimumBasicScenario-server-843445589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-843445589',id=17,image_ref='d5ab5add-8fb7-4436-88e9-0cb945ddc863',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5kEQk6hWARzkdAxYq+PejmYD/iouSlgR4Vlro63NFT4BV2SLZZTcJkpc9dGXv7UnIQQu9gOJSnEV3QeSQLSLjalHZp/U4BZsLF5Brm42aa21bVfKcjzsWQHuJOfIGbHg==',key_name='tempest-TestMinimumBasicScenario-1698854069',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead44a7da0640cbb2cf8dece0ea4f40',ramdisk_id='',reservation_id='r-2uysnlk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5ab5add-8fb7-4436-88e9-0cb945ddc863',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-339882477',owner_user_name='tempest-TestMinimumBasicScenario-339882477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:53:57Z,user_data=None,user_id='6cae5a1734d24ac8aebc233dd31d3084',uuid=f3e0bc01-1cf2-4ff9-bec6-12a37e44171c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converting VIF {"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": 
"ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.objects.instance [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lazy-loading 'pci_devices' on Instance uuid f3e0bc01-1cf2-4ff9-bec6-12a37e44171c {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] End _get_guest_xml xml= [guest domain XML logged here; the XML markup was stripped when this log was captured, leaving only bare timestamps and values. Recoverable fields from the residue: instance uuid f3e0bc01-1cf2-4ff9-bec6-12a37e44171c, domain name instance-00000011, memory 131072 KiB, 1 vCPU, nova metadata for tempest-TestMinimumBasicScenario-server-843445589 created 2023-04-21 10:53:59 with flavor m1.nano (128 MB RAM, 1 GB root disk, 1 vCPU) owned by tempest-TestMinimumBasicScenario-339882477-project-member / tempest-TestMinimumBasicScenario-339882477, sysinfo "OpenStack Foundation" / "OpenStack Nova" / version 0.0.0 / "Virtual Machine", os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-843445589',display_name='tempest-TestMinimumBasicScenario-server-843445589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-843445589',id=17,image_ref='d5ab5add-8fb7-4436-88e9-0cb945ddc863',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5kEQk6hWARzkdAxYq+PejmYD/iouSlgR4Vlro63NFT4BV2SLZZTcJkpc9dGXv7UnIQQu9gOJSnEV3QeSQLSLjalHZp/U4BZsLF5Brm42aa21bVfKcjzsWQHuJOfIGbHg==',key_name='tempest-TestMinimumBasicScenario-1698854069',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ead44a7da0640cbb2cf8dece0ea4f40',ramdisk_id='',reservation_id='r-2uysnlk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5ab5add-8fb7-4436-88e9-0cb945ddc863',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-339882477',owner_user_name='tempest-TestMinimumBasicScenario-339882477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:53:57Z,user_data=None,user_id='6cae5a1734d24ac8aebc233dd31d3084',uuid=f3e0bc01-1cf2-4ff9-bec6-12a37e44171c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converting VIF {"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": 
"ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG os_vif [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec8edd23-eb, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec8edd23-eb, col_values=(('external_ids', {'iface-id': 'ec8edd23-eb04-4e01-874f-7a5ad305eacc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:75:ec', 'vm-uuid': 'f3e0bc01-1cf2-4ff9-bec6-12a37e44171c'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:53:59 user nova-compute[70954]: INFO os_vif [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] No VIF found with MAC fa:16:3e:62:75:ec, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:53:59 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Deleted allocations for instance a5418b3c-8587-4544-b562-fe01a69be3fc Apr 21 10:53:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5ca798db-ac27-47dc-949b-c6831060dc6a tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "a5418b3c-8587-4544-b562-fe01a69be3fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.650s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.neutron [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Updated VIF entry in instance network info cache for port ec8edd23-eb04-4e01-874f-7a5ad305eacc. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG nova.network.neutron [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Updating instance_info_cache with network_info: [{"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:53:59 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-34de482a-e735-444e-b332-c369dc5cb4db req-0f94b517-e037-4f18-b878-68956960ff18 service nova] Releasing lock "refresh_cache-f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:54:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG nova.compute.manager [req-9e0b9c43-3407-4bc7-abe3-16c0b89922fc req-8fe0ea82-3672-4546-8926-5d46c3f66afd service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9e0b9c43-3407-4bc7-abe3-16c0b89922fc req-8fe0ea82-3672-4546-8926-5d46c3f66afd service nova] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9e0b9c43-3407-4bc7-abe3-16c0b89922fc req-8fe0ea82-3672-4546-8926-5d46c3f66afd service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9e0b9c43-3407-4bc7-abe3-16c0b89922fc req-8fe0ea82-3672-4546-8926-5d46c3f66afd service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG nova.compute.manager [req-9e0b9c43-3407-4bc7-abe3-16c0b89922fc req-8fe0ea82-3672-4546-8926-5d46c3f66afd service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] No waiting events found dispatching network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:01 user nova-compute[70954]: WARNING nova.compute.manager [req-9e0b9c43-3407-4bc7-abe3-16c0b89922fc req-8fe0ea82-3672-4546-8926-5d46c3f66afd service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received unexpected event network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc for instance with vm_state building and task_state spawning. Apr 21 10:54:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 
10:54:02 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.39", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:54:02 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] VM Resumed (Lifecycle Event) Apr 21 10:54:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:54:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: 
f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:54:03 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Instance spawned successfully. Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:54:03 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:54:03 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] VM Started (Lifecycle Event) Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:54:03 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:54:03 user nova-compute[70954]: INFO nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Took 6.43 seconds to spawn the instance on the hypervisor. 
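The AddBridgeCommand / AddPortCommand / DbSetCommand transaction recorded a few entries earlier is how os-vif wires tapec8edd23-eb into br-int before the guest above starts: the port is added to the integration bridge and its Interface row is tagged with the neutron port id, MAC and instance uuid so the OVN controller can bind it. Below is a minimal sketch of that same OVSDB transaction written directly against ovsdbapp; the tcp endpoint and connection setup are assumptions (os-vif derives them from its own config), while the port name, MAC, iface-id and uuid are copied from the log records.

    # Hedged sketch, not the os-vif source: replays the OVSDB transaction shown above.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB_ENDPOINT = 'tcp:127.0.0.1:6640'  # assumed endpoint; not taken from this log

    idl = connection.OvsdbIdl.from_server(OVSDB_ENDPOINT, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # AddBridgeCommand / AddPortCommand / DbSetCommand, as in the log records
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tapec8edd23-eb', may_exist=True))
        txn.add(api.db_set('Interface', 'tapec8edd23-eb',
                           ('external_ids', {
                               'iface-id': 'ec8edd23-eb04-4e01-874f-7a5ad305eacc',
                               'iface-status': 'active',
                               'attached-mac': 'fa:16:3e:62:75:ec',
                               'vm-uuid': 'f3e0bc01-1cf2-4ff9-bec6-12a37e44171c'})))

Once the external_ids are set, the port can be bound by the OVN driver, which is what produces the network-vif-plugged events seen in the following records.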
Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [req-dd2e9920-9b17-4f35-8cc1-c50e3fe6b55a req-1afb10cb-8392-4660-827d-56d2879f7168 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-dd2e9920-9b17-4f35-8cc1-c50e3fe6b55a req-1afb10cb-8392-4660-827d-56d2879f7168 service nova] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-dd2e9920-9b17-4f35-8cc1-c50e3fe6b55a req-1afb10cb-8392-4660-827d-56d2879f7168 service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-dd2e9920-9b17-4f35-8cc1-c50e3fe6b55a req-1afb10cb-8392-4660-827d-56d2879f7168 service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG nova.compute.manager [req-dd2e9920-9b17-4f35-8cc1-c50e3fe6b55a req-1afb10cb-8392-4660-827d-56d2879f7168 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] No waiting events found dispatching network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:03 user nova-compute[70954]: WARNING nova.compute.manager [req-dd2e9920-9b17-4f35-8cc1-c50e3fe6b55a req-1afb10cb-8392-4660-827d-56d2879f7168 service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received unexpected event network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc for instance with vm_state building and task_state spawning. Apr 21 10:54:03 user nova-compute[70954]: INFO nova.compute.manager [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Took 7.04 seconds to build instance. 
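The pop_instance_event sequence just above (take the per-instance "-events" lock, look for a registered waiter for network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc, find none, and emit the "Received unexpected event" warning) is a waiter-registry pattern. The sketch below is a rough illustration of that pattern only, not nova's implementation; class and method names mirror the log but the internals are simplified.

    import threading
    from collections import defaultdict

    class InstanceEvents:
        """Illustrative registry of external events expected per instance."""
        def __init__(self):
            self._lock = threading.Lock()        # stands in for the "<uuid>-events" lock
            self._waiters = defaultdict(dict)    # instance_uuid -> {event_key: Event}

        def prepare(self, instance_uuid, event_key):
            # Called before an operation that expects an external event.
            with self._lock:
                waiter = threading.Event()
                self._waiters[instance_uuid][event_key] = waiter
                return waiter

        def pop_instance_event(self, instance_uuid, event_key):
            # Called when the network service reports the event.
            with self._lock:
                return self._waiters.get(instance_uuid, {}).pop(event_key, None)

    events = InstanceEvents()
    waiter = events.pop_instance_event(
        'f3e0bc01-1cf2-4ff9-bec6-12a37e44171c',
        'network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc')
    if waiter is None:
        # corresponds to the WARNING above: nothing was waiting for this event
        print('Received unexpected event (no waiter registered)')
    else:
        waiter.set()

In the run logged here the instance was still building and had not registered a waiter for that port, so the event is dropped with a warning rather than dispatched.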
Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-23ef1f3c-39be-4342-8242-beff1fe5793e tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.162s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 4.892s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:03 user nova-compute[70954]: INFO nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share 
--output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:54:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:54:07 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:54:07 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
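The repeated qemu-img info invocations above come from the update_available_resource periodic task measuring each instance's disk usage. Every call runs under oslo_concurrency's prlimit wrapper (1 GiB address space, 30 s of CPU, matching --as=1073741824 --cpu=30) so a pathological image cannot stall the audit, and --force-share lets it inspect disks that running guests hold open. A rough equivalent of one such call is sketched below; the path is copied from the log, and this is only an approximation of the call nova makes through its own image utilities.

    import json
    from oslo_concurrency import processutils

    # Limits mirror the flags seen in the log: --as=1073741824 --cpu=30
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk',
        '--force-share', '--output=json',
        prlimit=limits)

    info = json.loads(out)
    # virtual-size is the allocated size of the disk; actual-size is on-disk usage
    print(info['virtual-size'], info.get('actual-size'))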
Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8562MB free_disk=26.40807342529297GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f4dda568-8f3b-40eb-aff3-64d3e759c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance d28e1e38-3ed5-468e-b672-8b94a909820c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f3e0bc01-1cf2-4ff9-bec6-12a37e44171c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing inventories for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating ProviderTree inventory for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing aggregate associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, aggregates: None {{(pid=70954) 
_refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing trait associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, traits: COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.632s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task 
ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:09 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:12 user nova-compute[70954]: INFO nova.compute.manager [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Terminating instance Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-36bbf5cc-20e4-4187-ab46-fb57f74ca030 req-d0557899-0524-4086-8f07-28042605c882 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-vif-unplugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-36bbf5cc-20e4-4187-ab46-fb57f74ca030 req-d0557899-0524-4086-8f07-28042605c882 service nova] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-36bbf5cc-20e4-4187-ab46-fb57f74ca030 req-d0557899-0524-4086-8f07-28042605c882 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-36bbf5cc-20e4-4187-ab46-fb57f74ca030 req-d0557899-0524-4086-8f07-28042605c882 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-36bbf5cc-20e4-4187-ab46-fb57f74ca030 req-d0557899-0524-4086-8f07-28042605c882 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] No waiting events found dispatching network-vif-unplugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-36bbf5cc-20e4-4187-ab46-fb57f74ca030 req-d0557899-0524-4086-8f07-28042605c882 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-vif-unplugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:12 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Instance destroyed successfully. 
Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.objects.instance [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lazy-loading 'resources' on Instance uuid 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:52:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1210829464',display_name='tempest-VolumesActionsTest-instance-1210829464',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1210829464',id=14,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:52:26Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8eaf2efa4ddc4d8fbc5ec14e86d93c53',ramdisk_id='',reservation_id='r-z7mh3xce',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-301211957',owner_user_name='tempest-VolumesActionsTest-301211957-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:52:26Z,user_data=None,user_id='dc80162b39bc4ff2a71a8ff4d34979c3',uuid=14bd5401-4cc1-4827-8d4a-fd1358bb9c6b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": 
"d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Converting VIF {"id": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "address": "fa:16:3e:63:4e:67", "network": {"id": "3fe4dad8-0e5a-4737-9c24-dc77d7b275ff", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1133574237-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "8eaf2efa4ddc4d8fbc5ec14e86d93c53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd8b0279f-95", "ovs_interfaceid": "d8b0279f-95ca-4143-8f78-c6faf74a3620", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG os_vif [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8b0279f-95, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:54:12 user nova-compute[70954]: INFO os_vif [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 
tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:4e:67,bridge_name='br-int',has_traffic_filtering=True,id=d8b0279f-95ca-4143-8f78-c6faf74a3620,network=Network(3fe4dad8-0e5a-4737-9c24-dc77d7b275ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd8b0279f-95') Apr 21 10:54:12 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Deleting instance files /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b_del Apr 21 10:54:12 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Deletion of /opt/stack/data/nova/instances/14bd5401-4cc1-4827-8d4a-fd1358bb9c6b_del complete Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:54:12 user nova-compute[70954]: INFO nova.compute.manager [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 21 10:54:12 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:54:12 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:54:13 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:54:13 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Took 0.62 seconds to deallocate network for instance. 
Apr 21 10:54:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-baa6b527-bdde-4ba4-83b7-1a4dc673730d req-96cf1493-3961-4bf6-a31f-ebd96f8494f1 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-vif-deleted-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:13 user nova-compute[70954]: INFO nova.compute.manager [req-baa6b527-bdde-4ba4-83b7-1a4dc673730d req-96cf1493-3961-4bf6-a31f-ebd96f8494f1 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Neutron deleted interface d8b0279f-95ca-4143-8f78-c6faf74a3620; detaching it from the instance and deleting it from the info cache Apr 21 10:54:13 user nova-compute[70954]: DEBUG nova.network.neutron [req-baa6b527-bdde-4ba4-83b7-1a4dc673730d req-96cf1493-3961-4bf6-a31f-ebd96f8494f1 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:54:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-baa6b527-bdde-4ba4-83b7-1a4dc673730d req-96cf1493-3961-4bf6-a31f-ebd96f8494f1 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Detach interface failed, port_id=d8b0279f-95ca-4143-8f78-c6faf74a3620, reason: Instance 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b could not be found. {{(pid=70954) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 10:54:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:13 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:54:13 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:54:13 user nova-compute[70954]: 
DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.232s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:13 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Deleted allocations for instance 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b Apr 21 10:54:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8e0e8137-27be-48a9-abc0-96f50edd3b07 tempest-VolumesActionsTest-301211957 tempest-VolumesActionsTest-301211957-project-member] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.702s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:14 user nova-compute[70954]: DEBUG nova.compute.manager [req-1132c827-14a9-45f4-8162-bf337b66e876 req-13b7922f-af94-4d74-a256-855ffab08af8 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received event network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1132c827-14a9-45f4-8162-bf337b66e876 req-13b7922f-af94-4d74-a256-855ffab08af8 service nova] Acquiring lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1132c827-14a9-45f4-8162-bf337b66e876 req-13b7922f-af94-4d74-a256-855ffab08af8 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1132c827-14a9-45f4-8162-bf337b66e876 req-13b7922f-af94-4d74-a256-855ffab08af8 service nova] Lock "14bd5401-4cc1-4827-8d4a-fd1358bb9c6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:14 user nova-compute[70954]: DEBUG nova.compute.manager [req-1132c827-14a9-45f4-8162-bf337b66e876 req-13b7922f-af94-4d74-a256-855ffab08af8 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] No waiting events found dispatching network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:14 user nova-compute[70954]: WARNING nova.compute.manager [req-1132c827-14a9-45f4-8162-bf337b66e876 req-13b7922f-af94-4d74-a256-855ffab08af8 service nova] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Received unexpected event network-vif-plugged-d8b0279f-95ca-4143-8f78-c6faf74a3620 for instance with vm_state deleted and task_state None. 
Apr 21 10:54:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:27 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:54:27 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] VM Stopped (Lifecycle Event) Apr 21 10:54:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-0bbe731c-8b97-4898-b244-c14dcaa91bfc None None] [instance: 14bd5401-4cc1-4827-8d4a-fd1358bb9c6b] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:54:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:54:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:42 user nova-compute[70954]: INFO nova.compute.manager [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Terminating instance Apr 21 10:54:42 user nova-compute[70954]: DEBUG nova.compute.manager [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG nova.compute.manager [req-974f87fd-df78-4a52-877a-0cc9ce523925 req-b760edb7-6b6f-4353-8127-398045ed324f service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-vif-unplugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-974f87fd-df78-4a52-877a-0cc9ce523925 req-b760edb7-6b6f-4353-8127-398045ed324f service nova] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-974f87fd-df78-4a52-877a-0cc9ce523925 req-b760edb7-6b6f-4353-8127-398045ed324f service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-974f87fd-df78-4a52-877a-0cc9ce523925 req-b760edb7-6b6f-4353-8127-398045ed324f service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG nova.compute.manager [req-974f87fd-df78-4a52-877a-0cc9ce523925 req-b760edb7-6b6f-4353-8127-398045ed324f service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] No waiting events found dispatching network-vif-unplugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:42 user nova-compute[70954]: DEBUG nova.compute.manager [req-974f87fd-df78-4a52-877a-0cc9ce523925 req-b760edb7-6b6f-4353-8127-398045ed324f service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-vif-unplugged-0e9676a1-1652-48fd-affd-355632de3ca2 for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:43 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Instance destroyed successfully. Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.objects.instance [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lazy-loading 'resources' on Instance uuid f4dda568-8f3b-40eb-aff3-64d3e759c310 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2043125688',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2043125688',id=12,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:50:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-5710yc5p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:52:50Z,user_data=None,user_id='54c67d90
b6014d9ea24ef2552006bc04',uuid=f4dda568-8f3b-40eb-aff3-64d3e759c310,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "0e9676a1-1652-48fd-affd-355632de3ca2", "address": "fa:16:3e:eb:77:eb", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e9676a1-16", "ovs_interfaceid": "0e9676a1-1652-48fd-affd-355632de3ca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG os_vif [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e9676a1-16, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:54:43 user nova-compute[70954]: INFO os_vif [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:77:eb,bridge_name='br-int',has_traffic_filtering=True,id=0e9676a1-1652-48fd-affd-355632de3ca2,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e9676a1-16') Apr 21 10:54:43 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Deleting instance files /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310_del Apr 21 10:54:43 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Deletion of /opt/stack/data/nova/instances/f4dda568-8f3b-40eb-aff3-64d3e759c310_del complete Apr 21 10:54:43 user nova-compute[70954]: INFO nova.compute.manager [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 21 10:54:43 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:54:43 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Took 0.46 seconds to deallocate network for instance. Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-7359489b-bf66-45c6-b0b8-f3a774751afa req-d98896af-bc39-4e27-b942-8485d78a5db2 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-vif-deleted-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:43 user nova-compute[70954]: INFO nova.compute.manager [req-7359489b-bf66-45c6-b0b8-f3a774751afa req-d98896af-bc39-4e27-b942-8485d78a5db2 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Neutron deleted interface 0e9676a1-1652-48fd-affd-355632de3ca2; detaching it from the instance and deleting it from the info cache Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.network.neutron [req-7359489b-bf66-45c6-b0b8-f3a774751afa req-d98896af-bc39-4e27-b942-8485d78a5db2 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-7359489b-bf66-45c6-b0b8-f3a774751afa req-d98896af-bc39-4e27-b942-8485d78a5db2 service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Detach interface failed, port_id=0e9676a1-1652-48fd-affd-355632de3ca2, reason: Instance f4dda568-8f3b-40eb-aff3-64d3e759c310 could not be found. 
{{(pid=70954) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:44 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:54:44 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:54:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.228s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:44 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Deleted allocations for instance f4dda568-8f3b-40eb-aff3-64d3e759c310 Apr 21 10:54:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-39c0a007-3e3c-47c0-8aa1-03dd709572b6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.542s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:45 user nova-compute[70954]: DEBUG 
nova.compute.manager [req-445e6ce2-f64f-475a-8d68-667a35e91f63 req-2aa69d1f-b7b0-4544-afcf-1aae4af0034d service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received event network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-445e6ce2-f64f-475a-8d68-667a35e91f63 req-2aa69d1f-b7b0-4544-afcf-1aae4af0034d service nova] Acquiring lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-445e6ce2-f64f-475a-8d68-667a35e91f63 req-2aa69d1f-b7b0-4544-afcf-1aae4af0034d service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-445e6ce2-f64f-475a-8d68-667a35e91f63 req-2aa69d1f-b7b0-4544-afcf-1aae4af0034d service nova] Lock "f4dda568-8f3b-40eb-aff3-64d3e759c310-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-445e6ce2-f64f-475a-8d68-667a35e91f63 req-2aa69d1f-b7b0-4544-afcf-1aae4af0034d service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] No waiting events found dispatching network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:45 user nova-compute[70954]: WARNING nova.compute.manager [req-445e6ce2-f64f-475a-8d68-667a35e91f63 req-2aa69d1f-b7b0-4544-afcf-1aae4af0034d service nova] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Received unexpected event network-vif-plugged-0e9676a1-1652-48fd-affd-355632de3ca2 for instance with vm_state deleted and task_state None. Apr 21 10:54:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:54:56 user nova-compute[70954]: DEBUG nova.compute.manager [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-changed-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:56 user nova-compute[70954]: DEBUG nova.compute.manager [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Refreshing instance network info cache due to event network-changed-92fab70c-ebc3-4119-8fa9-874304b51cb5. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:54:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] Acquiring lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:54:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] Acquired lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:54:56 user nova-compute[70954]: DEBUG nova.network.neutron [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Refreshing network info cache for port 92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:54:57 user nova-compute[70954]: DEBUG nova.network.neutron [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Updated VIF entry in instance network info cache for port 92fab70c-ebc3-4119-8fa9-874304b51cb5. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:54:57 user nova-compute[70954]: DEBUG nova.network.neutron [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Updating instance_info_cache with network_info: [{"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:54:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-23a23dc0-bd3a-419e-a9da-1737ecfe7ede req-7118d547-e561-428e-9d26-46153da13aff service nova] Releasing lock "refresh_cache-d28e1e38-3ed5-468e-b672-8b94a909820c" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:54:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils 
[None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:58 user nova-compute[70954]: INFO nova.compute.manager [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Terminating instance Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:54:58 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] VM Stopped (Lifecycle Event) Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8f2aeff5-4919-4fa6-a7d5-e43cbb0e80ff None None] [instance: f4dda568-8f3b-40eb-aff3-64d3e759c310] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-1349ba17-ca8a-4df8-a17b-bccf3fbb354e req-caa4310a-41d1-4ce3-b532-d1d1c478f5bd service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-vif-unplugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1349ba17-ca8a-4df8-a17b-bccf3fbb354e req-caa4310a-41d1-4ce3-b532-d1d1c478f5bd service nova] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1349ba17-ca8a-4df8-a17b-bccf3fbb354e req-caa4310a-41d1-4ce3-b532-d1d1c478f5bd service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-1349ba17-ca8a-4df8-a17b-bccf3fbb354e req-caa4310a-41d1-4ce3-b532-d1d1c478f5bd service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-1349ba17-ca8a-4df8-a17b-bccf3fbb354e req-caa4310a-41d1-4ce3-b532-d1d1c478f5bd service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] No waiting events found dispatching network-vif-unplugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-1349ba17-ca8a-4df8-a17b-bccf3fbb354e 
req-caa4310a-41d1-4ce3-b532-d1d1c478f5bd service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-vif-unplugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:54:58 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Instance destroyed successfully. Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.objects.instance [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'resources' on Instance uuid d28e1e38-3ed5-468e-b672-8b94a909820c {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2104835370',display_name='tempest-AttachVolumeNegativeTest-server-2104835370',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2104835370',id=15,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFgvsOkIKKUlR/vc13NXCoQlDdgzN+g6wKzZajODT/6/BJRfaLwQgnuA3mDp4OA0MDn0gizzj2Pl2nG82WPwPorkuBJwINQHac9OchhGaDq4Fh1FeHhPPumdwSQw3oc8bA==',key_name='tempest-keypair-962586907',keypairs=,launch_index=0,launched_at=2023-04-21T10:53:12Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-j20flv9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:53:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=d28e1e38-3ed5-468e-b672-8b94a909820c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "address": "fa:16:3e:13:04:3b", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap92fab70c-eb", "ovs_interfaceid": "92fab70c-ebc3-4119-8fa9-874304b51cb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG os_vif [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92fab70c-eb, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:54:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:54:58 user nova-compute[70954]: INFO os_vif [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:04:3b,bridge_name='br-int',has_traffic_filtering=True,id=92fab70c-ebc3-4119-8fa9-874304b51cb5,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92fab70c-eb') Apr 21 10:54:58 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 
tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Deleting instance files /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c_del Apr 21 10:54:58 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Deletion of /opt/stack/data/nova/instances/d28e1e38-3ed5-468e-b672-8b94a909820c_del complete Apr 21 10:54:59 user nova-compute[70954]: INFO nova.compute.manager [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 10:54:59 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:54:59 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:54:59 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:55:00 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Took 1.07 seconds to deallocate network for instance. 
Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:00 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Deleted allocations for instance d28e1e38-3ed5-468e-b672-8b94a909820c Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-0d042f03-6a2a-419c-b3ab-fdb1f27c6755 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.255s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG nova.compute.manager [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] Acquiring lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] Lock "d28e1e38-3ed5-468e-b672-8b94a909820c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:00 user nova-compute[70954]: DEBUG nova.compute.manager [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] No waiting events found dispatching network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:00 user nova-compute[70954]: WARNING nova.compute.manager [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received unexpected event network-vif-plugged-92fab70c-ebc3-4119-8fa9-874304b51cb5 for instance with vm_state deleted and task_state None. 
Apr 21 10:55:00 user nova-compute[70954]: DEBUG nova.compute.manager [req-3af9d7b5-2406-4a31-a516-e30e666ca5dc req-ecbadd18-812d-4623-8ed6-30d514e31c8d service nova] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Received event network-vif-deleted-92fab70c-ebc3-4119-8fa9-874304b51cb5 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:02 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:55:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 10:55:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None 
None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:55:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:06 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:55:06 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8767MB free_disk=26.480587005615234GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": 
"0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f3e0bc01-1cf2-4ff9-bec6-12a37e44171c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:55:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:11 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:55:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:55:13 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:55:13 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] VM Stopped (Lifecycle Event) Apr 21 10:55:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-9f7f2935-bfb6-45ef-9858-81c11fdd258f None None] [instance: d28e1e38-3ed5-468e-b672-8b94a909820c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:55:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 
tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:33 user nova-compute[70954]: INFO nova.compute.manager [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Terminating instance Apr 21 10:55:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-aa9a7202-897c-4e98-87e4-2f445f764e40 req-1547c05d-de8f-4881-8bb7-269d1f4b9c99 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-vif-unplugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-aa9a7202-897c-4e98-87e4-2f445f764e40 req-1547c05d-de8f-4881-8bb7-269d1f4b9c99 service nova] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-aa9a7202-897c-4e98-87e4-2f445f764e40 req-1547c05d-de8f-4881-8bb7-269d1f4b9c99 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-aa9a7202-897c-4e98-87e4-2f445f764e40 req-1547c05d-de8f-4881-8bb7-269d1f4b9c99 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-aa9a7202-897c-4e98-87e4-2f445f764e40 req-1547c05d-de8f-4881-8bb7-269d1f4b9c99 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] No waiting events found dispatching network-vif-unplugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-aa9a7202-897c-4e98-87e4-2f445f764e40 req-1547c05d-de8f-4881-8bb7-269d1f4b9c99 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-vif-unplugged-2a49817a-aed1-49bd-96b6-36286ff71e1c for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:55:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:34 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Instance destroyed successfully. Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.objects.instance [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lazy-loading 'resources' on Instance uuid 84b55fc0-e748-4c05-97ad-a6994c0487d2 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-76757344',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-76757344',id=1,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:47:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='aad84a0e014f47ddaeaddc88bf16b0a8',ramdisk_id='',reservation_id='r-zj852vb0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:48:59Z,user_data=None,user_id='54c67d90b6014d9ea24ef2552006bc04',uuid=84b55fc0-e748-4c05-97ad-a6994c0487d2,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": 
"10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converting VIF {"id": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "address": "fa:16:3e:4f:2d:82", "network": {"id": "cfb4de90-44ea-486a-b5c4-c3b1111aa2bd", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1667019531-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aad84a0e014f47ddaeaddc88bf16b0a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a49817a-ae", "ovs_interfaceid": "2a49817a-aed1-49bd-96b6-36286ff71e1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG os_vif [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a49817a-ae, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:55:34 user nova-compute[70954]: INFO os_vif [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:2d:82,bridge_name='br-int',has_traffic_filtering=True,id=2a49817a-aed1-49bd-96b6-36286ff71e1c,network=Network(cfb4de90-44ea-486a-b5c4-c3b1111aa2bd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a49817a-ae') Apr 21 10:55:34 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Deleting instance files /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2_del Apr 21 10:55:34 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Deletion of /opt/stack/data/nova/instances/84b55fc0-e748-4c05-97ad-a6994c0487d2_del complete Apr 21 10:55:34 user nova-compute[70954]: INFO nova.compute.manager [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 10:55:34 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:55:34 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Took 0.66 seconds to deallocate network for instance. Apr 21 10:55:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.161s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:35 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Deleted allocations for instance 84b55fc0-e748-4c05-97ad-a6994c0487d2 Apr 21 10:55:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-65357e91-8af4-4fbb-ace7-81f46050dea6 tempest-ServerBootFromVolumeStableRescueTest-1980957418 tempest-ServerBootFromVolumeStableRescueTest-1980957418-project-member] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.843s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG nova.compute.manager [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] Acquiring lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] Lock "84b55fc0-e748-4c05-97ad-a6994c0487d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:35 user nova-compute[70954]: DEBUG nova.compute.manager [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] No waiting events found dispatching network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:35 user nova-compute[70954]: WARNING nova.compute.manager [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received unexpected event network-vif-plugged-2a49817a-aed1-49bd-96b6-36286ff71e1c for instance with vm_state deleted and task_state None. 
Apr 21 10:55:35 user nova-compute[70954]: DEBUG nova.compute.manager [req-8fb8b663-1dff-4882-8db2-88780bf94013 req-f92ea705-513d-47ea-a3b3-e4f4748ded05 service nova] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Received event network-vif-deleted-2a49817a-aed1-49bd-96b6-36286ff71e1c {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:48 user nova-compute[70954]: INFO nova.compute.manager [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Terminating instance Apr 21 10:55:48 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 
tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-94ea02dd-d604-4b30-9130-d0173e838e38 req-80882831-6e5d-4bf1-b28d-68046bfc3e7d service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-vif-unplugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-94ea02dd-d604-4b30-9130-d0173e838e38 req-80882831-6e5d-4bf1-b28d-68046bfc3e7d service nova] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-94ea02dd-d604-4b30-9130-d0173e838e38 req-80882831-6e5d-4bf1-b28d-68046bfc3e7d service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-94ea02dd-d604-4b30-9130-d0173e838e38 req-80882831-6e5d-4bf1-b28d-68046bfc3e7d service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-94ea02dd-d604-4b30-9130-d0173e838e38 req-80882831-6e5d-4bf1-b28d-68046bfc3e7d service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] No waiting events found dispatching network-vif-unplugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-94ea02dd-d604-4b30-9130-d0173e838e38 req-80882831-6e5d-4bf1-b28d-68046bfc3e7d service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-vif-unplugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:55:49 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] VM Stopped (Lifecycle Event) Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.compute.manager [None req-45602677-8d17-4a4f-adab-8506a9abed0a None None] [instance: 84b55fc0-e748-4c05-97ad-a6994c0487d2] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:49 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Instance destroyed successfully. Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lazy-loading 'resources' on Instance uuid f3e0bc01-1cf2-4ff9-bec6-12a37e44171c {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:53:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-843445589',display_name='tempest-TestMinimumBasicScenario-server-843445589',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-843445589',id=17,image_ref='d5ab5add-8fb7-4436-88e9-0cb945ddc863',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5kEQk6hWARzkdAxYq+PejmYD/iouSlgR4Vlro63NFT4BV2SLZZTcJkpc9dGXv7UnIQQu9gOJSnEV3QeSQLSLjalHZp/U4BZsLF5Brm42aa21bVfKcjzsWQHuJOfIGbHg==',key_name='tempest-TestMinimumBasicScenario-1698854069',keypairs=,launch_index=0,launched_at=2023-04-21T10:54:03Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9ead44a7da0640cbb2cf8dece0ea4f40',ramdisk_id='',reservation_id='r-2uysnlk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5ab5add-8fb7-4436-88e9-0cb945ddc863',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-339882477',owner_user_name='tempest-TestMinimumBasicScenario-339882477-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:54:03Z,user_data=None,user_id='6cae5a1734d24ac8aebc233dd31d3084',uuid=f3e0bc01-1cf2-4ff9-bec6-12a37e44171c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converting VIF {"id": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "address": "fa:16:3e:62:75:ec", "network": {"id": "89cc600b-891d-4913-9f39-935f5c5bce86", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1522370995-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9ead44a7da0640cbb2cf8dece0ea4f40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tapec8edd23-eb", "ovs_interfaceid": "ec8edd23-eb04-4e01-874f-7a5ad305eacc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG os_vif [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec8edd23-eb, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:55:49 user nova-compute[70954]: INFO os_vif [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:75:ec,bridge_name='br-int',has_traffic_filtering=True,id=ec8edd23-eb04-4e01-874f-7a5ad305eacc,network=Network(89cc600b-891d-4913-9f39-935f5c5bce86),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec8edd23-eb') Apr 21 10:55:49 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Deleting instance files /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c_del Apr 21 10:55:49 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] 
[instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Deletion of /opt/stack/data/nova/instances/f3e0bc01-1cf2-4ff9-bec6-12a37e44171c_del complete Apr 21 10:55:49 user nova-compute[70954]: INFO nova.compute.manager [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 10:55:49 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:55:49 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Took 0.48 seconds to deallocate network for instance. 
Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.compute.manager [req-80fa3837-670d-4f8b-b58c-327fd66ec919 req-c1447460-842d-4249-94c2-52da89e9588b service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-vif-deleted-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:55:49 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:50 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Deleted allocations for instance f3e0bc01-1cf2-4ff9-bec6-12a37e44171c Apr 21 10:55:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f28a08f9-0f2d-47a9-8019-f4c01a01704c tempest-TestMinimumBasicScenario-339882477 tempest-TestMinimumBasicScenario-339882477-project-member] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.615s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:50 user nova-compute[70954]: DEBUG nova.compute.manager [req-3193da52-2ce8-487f-93cc-1520f91e52e2 req-73c21077-a704-46a3-80e1-3fdde6dac9de service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received event network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3193da52-2ce8-487f-93cc-1520f91e52e2 req-73c21077-a704-46a3-80e1-3fdde6dac9de service nova] Acquiring lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3193da52-2ce8-487f-93cc-1520f91e52e2 req-73c21077-a704-46a3-80e1-3fdde6dac9de service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3193da52-2ce8-487f-93cc-1520f91e52e2 req-73c21077-a704-46a3-80e1-3fdde6dac9de service nova] Lock "f3e0bc01-1cf2-4ff9-bec6-12a37e44171c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:50 user nova-compute[70954]: DEBUG nova.compute.manager [req-3193da52-2ce8-487f-93cc-1520f91e52e2 req-73c21077-a704-46a3-80e1-3fdde6dac9de service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] No waiting events found dispatching network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:50 user nova-compute[70954]: WARNING nova.compute.manager [req-3193da52-2ce8-487f-93cc-1520f91e52e2 req-73c21077-a704-46a3-80e1-3fdde6dac9de service nova] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Received unexpected event network-vif-plugged-ec8edd23-eb04-4e01-874f-7a5ad305eacc for instance with vm_state deleted and task_state None. 
Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "c70df604-601e-4451-828d-20c649b6052a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:55:52 user nova-compute[70954]: INFO nova.compute.claims [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Claim successful on node user Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:55:52 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.policy [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7fc66871488428e9842404d885bcfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '14bc6b0c20204c8287b3523814007856', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Start spawning the instance on the hypervisor. {{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:55:52 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Creating image(s) Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "/opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "/opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock 
"/opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.130s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk 1073741824" returned: 0 in 0.055s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:52 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Cannot resize image /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.objects.instance [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'migration_context' on Instance uuid c70df604-601e-4451-828d-20c649b6052a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Ensure instance console log exists: /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Successfully created port: 5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Successfully updated port: 5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquired lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:55:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.compute.manager [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-changed-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.compute.manager [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Refreshing instance network info cache due to event network-changed-5751ca80-5041-4999-b832-b427fb0af8a2. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] Acquiring lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.neutron [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Updating instance_info_cache with network_info: [{"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Releasing lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Instance network_info: |[{"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] Acquired lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.neutron [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Refreshing network info cache for port 5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Start _get_guest_xml network_info=[{"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': 
None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:55:54 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:55:54 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Image 
pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:55:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1645771321',display_name='tempest-AttachVolumeNegativeTest-server-1645771321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1645771321',id=18,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHp6Xpoy4by46AxWCAbGiYDwkQpafANsc/yEMfhdZjZu+ouzXhRTZE2gofYmbc0DufFTp50aDb84APASGieWAMislJ/20uHZwVFUyEEqhMGv/VNdPqJyM1DgvCFkYCmKMQ==',key_name='tempest-keypair-610711731',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-7t5z091m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:55:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=c70df604-601e-4451-828d-20c649b6052a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.objects.instance [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'pci_devices' on Instance uuid c70df604-601e-4451-828d-20c649b6052a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] End _get_guest_xml xml= Apr 21 10:55:54 user nova-compute[70954]: c70df604-601e-4451-828d-20c649b6052a Apr 21 10:55:54 user nova-compute[70954]: instance-00000012 Apr 21 10:55:54 user nova-compute[70954]: 131072 Apr 21 10:55:54 user nova-compute[70954]: 1 Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-server-1645771321 Apr 21 10:55:54 user nova-compute[70954]: 2023-04-21 10:55:54 Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: 128 Apr 21 10:55:54 user nova-compute[70954]: 1 Apr 21 10:55:54 user nova-compute[70954]: 0 Apr 21 10:55:54 user nova-compute[70954]: 0 Apr 21 10:55:54 user nova-compute[70954]: 1 Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-159654333-project-member Apr 21 10:55:54 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-159654333 Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: OpenStack Foundation Apr 21 10:55:54 user nova-compute[70954]: OpenStack Nova Apr 21 10:55:54 user nova-compute[70954]: 0.0.0 Apr 21 10:55:54 user 
nova-compute[70954]: c70df604-601e-4451-828d-20c649b6052a Apr 21 10:55:54 user nova-compute[70954]: c70df604-601e-4451-828d-20c649b6052a Apr 21 10:55:54 user nova-compute[70954]: Virtual Machine Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: hvm Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Nehalem Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: /dev/urandom Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: Apr 21 10:55:54 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:55:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1645771321',display_name='tempest-AttachVolumeNegativeTest-server-1645771321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1645771321',id=18,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHp6Xpoy4by46AxWCAbGiYDwkQpafANsc/yEMfhdZjZu+ouzXhRTZE2gofYmbc0DufFTp50aDb84APASGieWAMislJ/20uHZwVFUyEEqhMGv/VNdPqJyM1DgvCFkYCmKMQ==',key_name='tempest-keypair-610711731',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-7t5z091m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:55:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=c70df604-601e-4451-828d-20c649b6052a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG os_vif [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5751ca80-50, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5751ca80-50, col_values=(('external_ids', {'iface-id': '5751ca80-5041-4999-b832-b427fb0af8a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:74:cd', 'vm-uuid': 'c70df604-601e-4451-828d-20c649b6052a'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:54 user nova-compute[70954]: INFO os_vif [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:55:54 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] No VIF found with MAC fa:16:3e:5f:74:cd, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG nova.network.neutron [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Updated VIF entry in instance network info cache for port 5751ca80-5041-4999-b832-b427fb0af8a2. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG nova.network.neutron [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Updating instance_info_cache with network_info: [{"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a1e7c6d8-a97c-4b11-a021-168d803c1123 req-7e24370d-dc38-4dbd-953c-5b3fbe5e91b8 service nova] Releasing lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-db3a53fe-3d55-4177-8e04-271a80af59ff req-735f2519-3396-48fe-ad93-79a02af9891c service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-db3a53fe-3d55-4177-8e04-271a80af59ff req-735f2519-3396-48fe-ad93-79a02af9891c service nova] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-db3a53fe-3d55-4177-8e04-271a80af59ff req-735f2519-3396-48fe-ad93-79a02af9891c service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:55 user 
nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-db3a53fe-3d55-4177-8e04-271a80af59ff req-735f2519-3396-48fe-ad93-79a02af9891c service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-db3a53fe-3d55-4177-8e04-271a80af59ff req-735f2519-3396-48fe-ad93-79a02af9891c service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] No waiting events found dispatching network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:55 user nova-compute[70954]: WARNING nova.compute.manager [req-db3a53fe-3d55-4177-8e04-271a80af59ff req-735f2519-3396-48fe-ad93-79a02af9891c service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received unexpected event network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 for instance with vm_state building and task_state spawning. Apr 21 10:55:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:55:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] VM Resumed (Lifecycle Event) Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:55:57 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: c70df604-601e-4451-828d-20c649b6052a] Instance spawned successfully. 
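For readers unfamiliar with the os-vif plug step recorded at 10:55:54 above (AddPortCommand followed by DbSetCommand on the Interface row), the same effect can be produced by hand with ovs-vsctl. The sketch below only illustrates that equivalence with plain subprocess calls; it is not Nova or os-vif code, and the bridge, port name, MAC and UUIDs are simply copied from the log entries above.

# Illustrative only: replays the logged os-vif/ovsdbapp port plug
# (AddPortCommand + DbSetCommand on Interface) with equivalent ovs-vsctl
# invocations. Requires root and a running Open vSwitch; not Nova code.
import subprocess

BRIDGE = "br-int"
PORT = "tap5751ca80-50"
EXTERNAL_IDS = {
    "iface-id": "5751ca80-5041-4999-b832-b427fb0af8a2",   # Neutron port UUID
    "iface-status": "active",
    "attached-mac": "fa:16:3e:5f:74:cd",
    "vm-uuid": "c70df604-601e-4451-828d-20c649b6052a",    # Nova instance UUID
}

def plug_port() -> None:
    # --may-exist mirrors AddPortCommand(may_exist=True): the add is idempotent.
    subprocess.run(["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT],
                   check=True)
    # DbSetCommand(table=Interface, col_values=external_ids) equivalent; the
    # values are quoted so ovs-vsctl parses MACs and UUIDs as single strings.
    settings = [f'external_ids:{key}="{value}"' for key, value in EXTERNAL_IDS.items()]
    subprocess.run(["ovs-vsctl", "set", "Interface", PORT, *settings],
                   check=True)

if __name__ == "__main__":
    plug_port()

The --may-exist flag is what makes the logged "Transaction caused no change" outcome harmless: re-adding an already plugged port is a no-op rather than an error.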
Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 
c70df604-601e-4451-828d-20c649b6052a] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:55:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:55:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] VM Started (Lifecycle Event) Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:55:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:55:57 user nova-compute[70954]: INFO nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Took 5.30 seconds to spawn the instance on the hypervisor. Apr 21 10:55:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:55:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:55:57 user nova-compute[70954]: INFO nova.compute.manager [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Took 5.79 seconds to build instance. 
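The driver registers defaults for image properties the image did not define (hw_cdrom_bus=ide, hw_disk_bus=virtio, hw_video_model=virtio, hw_vif_model=virtio, and None for hw_input_bus and hw_pointer_model). Later in this log the same values appear in the instance system_metadata under image_hw_* keys, so a rough sketch of that bookkeeping looks like the following (plain dicts, not Nova's actual objects):

    # Defaults discovered by the driver, exactly as logged above.
    discovered_defaults = {
        "hw_cdrom_bus": "ide",
        "hw_disk_bus": "virtio",
        "hw_input_bus": None,
        "hw_pointer_model": None,
        "hw_video_model": "virtio",
        "hw_vif_model": "virtio",
    }

    # Persist them under the image_* prefix used in the instance system_metadata
    # (the same keys show up later in this log, e.g. image_hw_disk_bus='virtio').
    system_metadata = {}
    for prop, value in discovered_defaults.items():
        system_metadata["image_%s" % prop] = value

    print(system_metadata["image_hw_disk_bus"])   # -> virtio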
Apr 21 10:55:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-4c31f005-a915-4da5-b60e-f1454bb6e0cc tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "c70df604-601e-4451-828d-20c649b6052a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.898s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-e7efc24f-0e41-4ddf-a852-e9210bdb4a42 req-a2835900-9031-4873-82b5-66b1d17a5d30 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:55:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e7efc24f-0e41-4ddf-a852-e9210bdb4a42 req-a2835900-9031-4873-82b5-66b1d17a5d30 service nova] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:55:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e7efc24f-0e41-4ddf-a852-e9210bdb4a42 req-a2835900-9031-4873-82b5-66b1d17a5d30 service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:55:58 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e7efc24f-0e41-4ddf-a852-e9210bdb4a42 req-a2835900-9031-4873-82b5-66b1d17a5d30 service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:55:58 user nova-compute[70954]: DEBUG nova.compute.manager [req-e7efc24f-0e41-4ddf-a852-e9210bdb4a42 req-a2835900-9031-4873-82b5-66b1d17a5d30 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] No waiting events found dispatching network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:55:58 user nova-compute[70954]: WARNING nova.compute.manager [req-e7efc24f-0e41-4ddf-a852-e9210bdb4a42 req-a2835900-9031-4873-82b5-66b1d17a5d30 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received unexpected event network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 for instance with vm_state active and task_state None. 
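Both the build path (lock "c70df604-..." held for 5.898s by _locked_do_build_and_run_instance) and the event dispatcher (lock "c70df604-...-events") rely on named locks from oslo.concurrency. A small sketch of the same decorator-plus-context-manager pattern, with placeholder bodies:

    from oslo_concurrency import lockutils

    def build_and_run_instance(instance_uuid):
        # Serialize all build/delete work on one instance behind a lock named
        # after its UUID, as in the log above.
        @lockutils.synchronized(instance_uuid)
        def _locked_do_build_and_run_instance():
            # ... spawn the guest, plug VIFs, etc. (placeholder) ...
            return "done"

        return _locked_do_build_and_run_instance()

    # The same module also offers a context-manager form, used here for the
    # separate "<uuid>-events" lock.
    with lockutils.lock("c70df604-601e-4451-828d-20c649b6052a-events"):
        pass  # pop/dispatch instance events while holding the "-events" lock

    print(build_and_run_instance("c70df604-601e-4451-828d-20c649b6052a"))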
Apr 21 10:55:59 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:56:03 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid f8609da3-c26d-482a-bc03-017baf4bce22 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:56:04 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] VM Stopped (Lifecycle Event) Apr 21 10:56:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-3addf661-095f-4653-8bd1-70dd8e96a030 None None] [instance: f3e0bc01-1cf2-4ff9-bec6-12a37e44171c] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [{"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.39", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-f8609da3-c26d-482a-bc03-017baf4bce22" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:56:04 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:05 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:56:05 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:56:05 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8852MB free_disk=26.552791595458984GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance f8609da3-c26d-482a-bc03-017baf4bce22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance c70df604-601e-4451-828d-20c649b6052a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:56:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "f8609da3-c26d-482a-bc03-017baf4bce22" acquired by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:07 user nova-compute[70954]: INFO nova.compute.manager [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Terminating instance Apr 21 10:56:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG nova.compute.manager [req-6cea336d-c90f-47e0-bf48-571609867391 req-6173f3bb-2943-4b2e-a5d5-80b24b37a85d service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-vif-unplugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6cea336d-c90f-47e0-bf48-571609867391 req-6173f3bb-2943-4b2e-a5d5-80b24b37a85d service nova] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6cea336d-c90f-47e0-bf48-571609867391 req-6173f3bb-2943-4b2e-a5d5-80b24b37a85d service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6cea336d-c90f-47e0-bf48-571609867391 req-6173f3bb-2943-4b2e-a5d5-80b24b37a85d service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG nova.compute.manager [req-6cea336d-c90f-47e0-bf48-571609867391 req-6173f3bb-2943-4b2e-a5d5-80b24b37a85d service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] No waiting events found dispatching network-vif-unplugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:56:07 user nova-compute[70954]: DEBUG nova.compute.manager [req-6cea336d-c90f-47e0-bf48-571609867391 req-6173f3bb-2943-4b2e-a5d5-80b24b37a85d service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-vif-unplugged-f210779b-302b-4a17-8b57-07837ea54e12 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:56:08 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Instance destroyed successfully. 
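"Instance destroyed successfully" above comes from the libvirt driver tearing down the guest for f8609da3-.... Outside of Nova, the equivalent low-level operation with the libvirt Python bindings looks roughly like this (a sketch only; Nova's driver adds retries, undefine flags, and device cleanup that are not shown):

    import libvirt

    # Connect to the local system libvirtd, as nova-compute does.
    conn = libvirt.open('qemu:///system')
    try:
        dom = conn.lookupByUUIDString('f8609da3-c26d-482a-bc03-017baf4bce22')
        if dom.isActive():
            dom.destroy()       # hard power-off of the running guest
        dom.undefine()          # drop the persistent domain definition
    finally:
        conn.close()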
Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.objects.instance [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lazy-loading 'resources' on Instance uuid f8609da3-c26d-482a-bc03-017baf4bce22 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:47:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1307788712',display_name='tempest-ServerActionsTestJSON-server-1307788712',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1307788712',id=3,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP17pDhSIQbCv4xewaSR+c65YmMH+hIkRmyXO1jHYq3hmftzXxLb6EXcvZayMHXJMHoDUOwUfoaQ/r3kME39pIqEI1cveoujwBV7i5jBCcTH71kCrlaE9KNWPqoT9mc/lQ==',key_name='tempest-keypair-1735824251',keypairs=,launch_index=0,launched_at=2023-04-21T10:47:30Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='648163a728fc4b28b85a24e9198d356b',ramdisk_id='',reservation_id='r-0qt8u06d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-1614287361',owner_user_name='tempest-ServerActionsTestJSON-1614287361-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:47:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ced216baa4a64c72946cf3f71eb873dd',uuid=f8609da3-c26d-482a-bc03-017baf4bce22,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "172.24.4.39", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Converting VIF {"id": "f210779b-302b-4a17-8b57-07837ea54e12", "address": "fa:16:3e:c3:c6:d1", "network": {"id": "ba8e9ff2-e562-462e-a2fa-0d7f643da26c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-83296950-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.39", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "648163a728fc4b28b85a24e9198d356b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf210779b-30", "ovs_interfaceid": "f210779b-302b-4a17-8b57-07837ea54e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG os_vif [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:08 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf210779b-30, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:56:08 user nova-compute[70954]: INFO os_vif [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:c6:d1,bridge_name='br-int',has_traffic_filtering=True,id=f210779b-302b-4a17-8b57-07837ea54e12,network=Network(ba8e9ff2-e562-462e-a2fa-0d7f643da26c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf210779b-30') Apr 21 10:56:08 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Deleting instance files /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22_del Apr 21 10:56:08 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Deletion of /opt/stack/data/nova/instances/f8609da3-c26d-482a-bc03-017baf4bce22_del complete Apr 21 10:56:08 user nova-compute[70954]: INFO nova.compute.manager [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 21 10:56:08 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
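Unplugging the VIF goes through os-vif: the converted VIFOpenVSwitch object is handed to the ovs plugin, which removes the tap port from br-int (the DelPortCommand(port=tapf210779b-30, bridge=br-int, if_exists=True) transaction above). A rough sketch of driving the same library directly; the port, bridge, and address values are copied from the log, while the InstanceInfo fields are illustrative:

    import os_vif
    from os_vif.objects import instance_info, network, vif

    # Load the linux_bridge/noop/ovs plugins, as nova-compute does at startup.
    os_vif.initialize()

    my_vif = vif.VIFOpenVSwitch(
        id='f210779b-302b-4a17-8b57-07837ea54e12',
        address='fa:16:3e:c3:c6:d1',
        vif_name='tapf210779b-30',
        bridge_name='br-int',
        plugin='ovs',
        network=network.Network(id='ba8e9ff2-e562-462e-a2fa-0d7f643da26c'),
    )
    inst = instance_info.InstanceInfo(
        uuid='f8609da3-c26d-482a-bc03-017baf4bce22',
        name='tempest-ServerActionsTestJSON-server-1307788712',
    )

    # Under the hood the ovs plugin issues the del_port seen in the log above.
    os_vif.unplug(my_vif, inst)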
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:56:08 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Took 0.79 seconds to deallocate network for instance. Apr 21 10:56:08 user nova-compute[70954]: DEBUG nova.compute.manager [req-567bd3e5-3be5-478f-a15e-9e8e3fc8686d req-56ac6ece-1362-49ae-be10-5c05a1312633 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-vif-deleted-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 
tempest-ServerActionsTestJSON-1614287361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:09 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Deleted allocations for instance f8609da3-c26d-482a-bc03-017baf4bce22 Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-46507153-9774-4c6b-ae6d-d2448087cdae tempest-ServerActionsTestJSON-1614287361 tempest-ServerActionsTestJSON-1614287361-project-member] Lock "f8609da3-c26d-482a-bc03-017baf4bce22" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.019s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG nova.compute.manager [req-4f8a6846-ea8b-4893-a907-bb850099c5ea req-4a5312ed-e9c7-467d-8626-faded27349a4 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received event network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4f8a6846-ea8b-4893-a907-bb850099c5ea req-4a5312ed-e9c7-467d-8626-faded27349a4 service nova] Acquiring lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4f8a6846-ea8b-4893-a907-bb850099c5ea req-4a5312ed-e9c7-467d-8626-faded27349a4 service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4f8a6846-ea8b-4893-a907-bb850099c5ea req-4a5312ed-e9c7-467d-8626-faded27349a4 service nova] Lock "f8609da3-c26d-482a-bc03-017baf4bce22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:09 user nova-compute[70954]: DEBUG nova.compute.manager [req-4f8a6846-ea8b-4893-a907-bb850099c5ea req-4a5312ed-e9c7-467d-8626-faded27349a4 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] No waiting events found dispatching network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:56:09 user nova-compute[70954]: WARNING nova.compute.manager [req-4f8a6846-ea8b-4893-a907-bb850099c5ea req-4a5312ed-e9c7-467d-8626-faded27349a4 service nova] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Received unexpected event network-vif-plugged-f210779b-302b-4a17-8b57-07837ea54e12 for instance with vm_state deleted and task_state None. 
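The placement inventory reported just above stays constant across these updates: 12 VCPU at allocation_ratio 4.0, 16023 MB of RAM with 512 MB reserved, and 40 GB of disk. The capacity placement schedules against is (total - reserved) * allocation_ratio, which a quick check reproduces:

    # Inventory exactly as logged for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0 -- which is why the two
    # 1-vCPU / 128 MB guests barely dent the earlier "Final resource view"
    # (used_ram=768MB, used_vcpus=2).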
Apr 21 10:56:10 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:10 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:56:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:56:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:56:23 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] VM Stopped (Lifecycle Event) Apr 21 10:56:23 user nova-compute[70954]: DEBUG nova.compute.manager [None req-dbcaa770-c365-4d63-94d3-bd604791e0f3 None None] [instance: f8609da3-c26d-482a-bc03-017baf4bce22] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:56:23 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Starting instance... 
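Interleaved with the request-driven work, the same worker keeps running ComputeManager periodic tasks (_heal_instance_info_cache, _poll_rebooting_instances, _reclaim_queued_deletes, update_available_resource, ...), as logged above. Those come from oslo.service's periodic task machinery; a self-contained sketch of that mechanism, with made-up task names and intervals:

    import time

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    CONF([], project='periodic-demo')   # empty command line; illustrative project name

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self, conf):
            super().__init__(conf)

        @periodic_task.periodic_task(spacing=10)
        def _heal_info_cache(self, context):
            print('healing info cache')

        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _audit_resources(self, context):
            print('auditing resources')

    mgr = DemoManager(CONF)
    # nova-compute drives this from a looping call; here we just tick it a few times.
    for _ in range(3):
        mgr.run_periodic_tasks(context=None)
        time.sleep(1)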
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:56:35 user nova-compute[70954]: INFO nova.compute.claims [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Claim successful on node user Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.network.neutron [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:56:35 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.policy [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95956d2e4ea84534b6d5628eb8dd184d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc12d15daf34c5e9d26a6cc53795efe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:56:35 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Creating image(s) Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "/opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "/opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "/opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:35 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk 1073741824" returned: 0 in 0.053s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s 
{{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.141s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Checking if we can resize image /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG nova.network.neutron [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Successfully created port: eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Cannot resize image /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG nova.objects.instance [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lazy-loading 'migration_context' on Instance uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Ensure instance console log exists: /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:36 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Successfully updated port: eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 
tempest-ServersNegativeTestJSON-1265836442-project-member] Acquired lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.compute.manager [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event network-changed-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.compute.manager [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Refreshing instance network info cache due to event network-changed-eb0b0125-965b-4825-aab1-3ba81be44c2f. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] Acquiring lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.neutron [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updating instance_info_cache with network_info: [{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Releasing lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Instance network_info: |[{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] Acquired lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.neutron [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Refreshing network info cache for port eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Start _get_guest_xml network_info=[{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:56:37 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:56:37 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:56:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-2123160889',display_name='tempest-ServersNegativeTestJSON-server-2123160889',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-2123160889',id=19,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc12d15daf34c5e9d26a6cc53795efe',ramdisk_id='',reservation_id='r-1dp2q30u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1265836442',owner_user_name='tempest-ServersNegativeTes
tJSON-1265836442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:56:36Z,user_data=None,user_id='95956d2e4ea84534b6d5628eb8dd184d',uuid=595d41a4-9a01-4aa2-96a1-c2c763475184,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converting VIF {"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.objects.instance [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] 
Lazy-loading 'pci_devices' on Instance uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] End _get_guest_xml xml= [guest domain XML not preserved in this capture; recoverable values: name instance-00000013, uuid 595d41a4-9a01-4aa2-96a1-c2c763475184, title tempest-ServersNegativeTestJSON-server-2123160889, memory 131072, 1 vCPU, hvm, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0 / Virtual Machine, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:56:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-2123160889',display_name='tempest-ServersNegativeTestJSON-server-2123160889',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-2123160889',id=19,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc12d15daf34c5e9d26a6cc53795efe',ramdisk_id='',reservation_id='r-1dp2q30u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1265836442',owner_user_name='tempest-ServersNegativeTestJSON-1265836442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:56:36Z,user_data=None,user_id='95956d2e4ea84534b6d5628eb8dd184d',uuid=595d41a4-9a01-4aa2-96a1-c2c763475184,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [],
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converting VIF {"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG os_vif [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:37 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb0b0125-96, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeb0b0125-96, col_values=(('external_ids', {'iface-id': 'eb0b0125-965b-4825-aab1-3ba81be44c2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:39:c1', 'vm-uuid': '595d41a4-9a01-4aa2-96a1-c2c763475184'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:37 user nova-compute[70954]: INFO os_vif [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:56:37 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] No VIF found with MAC fa:16:3e:13:39:c1, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:56:38 user nova-compute[70954]: DEBUG nova.network.neutron [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updated VIF entry in instance network info cache for port eb0b0125-965b-4825-aab1-3ba81be44c2f. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:56:38 user nova-compute[70954]: DEBUG nova.network.neutron [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updating instance_info_cache with network_info: [{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:56:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-25e5f47c-9f3f-47c7-b694-bc5617756697 req-537c5212-a301-4dd7-b881-178d9baaa154 service nova] Releasing lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:56:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG nova.compute.manager [req-276b9752-4df5-4a73-b49b-63f7ece5ac89 req-1ff1e6fc-979d-449d-87d4-b3bb5a462a0a service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event 
network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-276b9752-4df5-4a73-b49b-63f7ece5ac89 req-1ff1e6fc-979d-449d-87d4-b3bb5a462a0a service nova] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-276b9752-4df5-4a73-b49b-63f7ece5ac89 req-1ff1e6fc-979d-449d-87d4-b3bb5a462a0a service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-276b9752-4df5-4a73-b49b-63f7ece5ac89 req-1ff1e6fc-979d-449d-87d4-b3bb5a462a0a service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG nova.compute.manager [req-276b9752-4df5-4a73-b49b-63f7ece5ac89 req-1ff1e6fc-979d-449d-87d4-b3bb5a462a0a service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] No waiting events found dispatching network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:56:39 user nova-compute[70954]: WARNING nova.compute.manager [req-276b9752-4df5-4a73-b49b-63f7ece5ac89 req-1ff1e6fc-979d-449d-87d4-b3bb5a462a0a service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received unexpected event network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f for instance with vm_state building and task_state spawning. 
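The network-vif-plugged handling above follows the external-event pattern: the build thread registers a waiter for an expected event name before plugging the VIF, and when Neutron later reports the event through the compute API, the manager pops the matching waiter and signals it. If nothing is registered when the event arrives, the event is logged as unexpected and dropped, which is exactly the WARNING seen here. A minimal sketch of that registry pattern, using plain threading primitives and illustrative function names (prepare_for_event, pop_instance_event are not Nova's actual classes):

import threading
from collections import defaultdict

# Hypothetical per-instance registry of expected external events.
_waiters = defaultdict(dict)      # instance_uuid -> {event_name: threading.Event}
_lock = threading.Lock()

def prepare_for_event(instance_uuid, event_name):
    """Register interest in an event before triggering the action that causes it."""
    waiter = threading.Event()
    with _lock:
        _waiters[instance_uuid][event_name] = waiter
    return waiter

def pop_instance_event(instance_uuid, event_name):
    """Return the registered waiter, or None if nobody is waiting (unexpected event)."""
    with _lock:
        return _waiters[instance_uuid].pop(event_name, None)

def external_instance_event(instance_uuid, event_name):
    """Called when the network service delivers the event to the compute host."""
    waiter = pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        print("Received unexpected event %s for instance %s" % (event_name, instance_uuid))
        return
    waiter.set()

def wait_for_instance_event(instance_uuid, event_name, timeout=300):
    waiter = prepare_for_event(instance_uuid, event_name)
    # ... plug the VIF / perform the action that should produce the event here ...
    if not waiter.wait(timeout):
        raise TimeoutError("timed out waiting for %s" % event_name)

In the trace above the event is delivered while the instance is still building but no waiter happens to be registered for it, so the compute manager emits the warning and carries on rather than failing the build.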
Apr 21 10:56:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:56:41 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] VM Resumed (Lifecycle Event) Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:56:41 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Instance spawned successfully. 
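"Guest created on hypervisor" followed by the Resumed lifecycle event broadly corresponds to the generated domain XML being defined in libvirt and the domain then being started, at which point libvirt begins emitting the lifecycle callbacks that nova.virt.driver re-publishes as events. A rough, hedged sketch of those underlying libvirt-python calls (domain XML heavily abbreviated; this is not Nova's actual spawn code):

import libvirt

# Abbreviated domain XML; a real Nova guest carries full disk, interface and device config.
DOMAIN_XML = """
<domain type='kvm'>
  <name>instance-00000001</name>
  <memory unit='MiB'>128</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

conn = libvirt.open('qemu:///system')   # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)        # persist the domain definition
dom.create()                            # power it on; libvirt emits lifecycle events
state, reason = dom.state()             # VIR_DOMAIN_RUNNING == 1 once the guest is up
print(state)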
Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 
595d41a4-9a01-4aa2-96a1-c2c763475184] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:56:41 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:56:41 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] VM Started (Lifecycle Event) Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:56:41 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:56:41 user nova-compute[70954]: INFO nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Took 5.41 seconds to spawn the instance on the hypervisor. Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:56:41 user nova-compute[70954]: INFO nova.compute.manager [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Took 5.92 seconds to build instance. 
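The power-state synchronization above compares the value stored in the database (0) with what the hypervisor reports (1); because the instance still has a pending task (spawning), the periodic sync deliberately skips any corrective action and leaves the in-flight operation in charge. A small illustration of that decision, using only the two codes visible in the log (0 = no state yet, 1 = running; the constant names below are illustrative, not Nova's module):

# Power-state codes as they appear in the log; names are illustrative.
NOSTATE = 0   # DB power_state before the guest has ever been observed running
RUNNING = 1   # state reported by the hypervisor once the domain is up

def sync_power_state(db_power_state, vm_power_state, task_state):
    """Decide whether the periodic sync should reconcile DB and hypervisor state."""
    if task_state is not None:
        # An operation (here: 'spawning') owns the instance; let it finish first.
        return "skip: pending task %s" % task_state
    if db_power_state != vm_power_state:
        return "update DB power_state %s -> %s" % (db_power_state, vm_power_state)
    return "in sync"

print(sync_power_state(NOSTATE, RUNNING, "spawning"))   # skip: pending task spawning
print(sync_power_state(NOSTATE, RUNNING, None))         # update DB power_state 0 -> 1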
Apr 21 10:56:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-a82dd866-9e37-47da-8db9-c3e084848b4a tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.013s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-d94b905a-75d9-48f8-b75c-a2bf059833bd req-ad2ddf72-7c5a-46b3-b180-5be326230140 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d94b905a-75d9-48f8-b75c-a2bf059833bd req-ad2ddf72-7c5a-46b3-b180-5be326230140 service nova] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d94b905a-75d9-48f8-b75c-a2bf059833bd req-ad2ddf72-7c5a-46b3-b180-5be326230140 service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d94b905a-75d9-48f8-b75c-a2bf059833bd req-ad2ddf72-7c5a-46b3-b180-5be326230140 service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:56:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-d94b905a-75d9-48f8-b75c-a2bf059833bd req-ad2ddf72-7c5a-46b3-b180-5be326230140 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] No waiting events found dispatching network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:56:41 user nova-compute[70954]: WARNING nova.compute.manager [req-d94b905a-75d9-48f8-b75c-a2bf059833bd req-ad2ddf72-7c5a-46b3-b180-5be326230140 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received unexpected event network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f for instance with vm_state active and task_state None. 
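The Acquiring/acquired/released triplets around the instance UUID and the "<uuid>-events" name come from oslo.concurrency's lockutils, which is used both as a decorator and as a context manager to serialize per-instance work (the build lock above was held for 6.013s, the events lock for well under a millisecond). A minimal sketch of both forms, with lock names mirroring the log but otherwise arbitrary:

from oslo_concurrency import lockutils

INSTANCE_UUID = "595d41a4-9a01-4aa2-96a1-c2c763475184"

@lockutils.synchronized(INSTANCE_UUID)
def locked_do_build_and_run_instance():
    # Only one build/run attempt for this instance can execute at a time.
    pass

def pop_event():
    # Context-manager form, as used for the short-lived "<uuid>-events" lock.
    with lockutils.lock(INSTANCE_UUID + "-events"):
        pass  # mutate the per-instance event registry while holding the lock

locked_do_build_and_run_instance()
pop_event()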
Apr 21 10:56:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:56:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:03 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:03 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:57:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:57:03 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:57:03 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 
c70df604-601e-4451-828d-20c649b6052a] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:57:04 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Updating instance_info_cache with network_info: [{"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:57:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:57:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG 
oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:57:06 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:07 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:57:07 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
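The disk audits above run qemu-img info through oslo.concurrency's prlimit wrapper so that a misbehaving qemu-img cannot use more than 1 GiB of address space (--as=1073741824) or 30 seconds of CPU time (--cpu=30). A hedged sketch of issuing the same kind of call with processutils (the path and limits are copied from the log; this is not Nova's exact helper):

from oslo_concurrency import processutils

DISK = "/opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk"

# Resource caps matching the --as/--cpu values seen in the log.
limits = processutils.ProcessLimits(address_space=1 * 1024 ** 3, cpu_time=30)

out, err = processutils.execute(
    "qemu-img", "info", DISK, "--force-share", "--output=json",
    prlimit=limits,
    env_variables={"LC_ALL": "C", "LANG": "C"},
)
print(out)   # JSON description of the image: format, virtual size, backing file, ...

The --force-share flag lets the query run against a disk that is currently attached to a running guest, which is why the resource tracker can audit both instances here without pausing them.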
Apr 21 10:57:07 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8925MB free_disk=26.544395446777344GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance c70df604-601e-4451-828d-20c649b6052a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:57:10 user nova-compute[70954]: INFO nova.compute.claims [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Claim successful on node user Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.network.neutron [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:57:10 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:57:10 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Creating image(s) Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "/opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "/opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "/opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG nova.policy [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f73ac02062c4411bde0c97f6a719926', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c0a611b8a8d54522929c37807054b2f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.150s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "d1fca309-1d26-4a34-b932-716064b86b00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk 1073741824" returned: 0 in 0.052s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.201s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:57:11 user nova-compute[70954]: INFO nova.compute.claims [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Claim successful on node user Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Checking if we can resize image /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Cannot resize image /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.objects.instance [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'migration_context' on Instance uuid 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Successfully created port: fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Ensure instance console log exists: /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Allocating IP information in the background. 
{{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:57:11 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Start spawning the instance on the hypervisor. {{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:57:11 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Creating image(s) Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG nova.policy [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f73ac02062c4411bde0c97f6a719926', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c0a611b8a8d54522929c37807054b2f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk 1073741824" returned: 0 in 0.050s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.189s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a 
--force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Cannot resize image /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'migration_context' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Ensure instance console log exists: /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.network.neutron [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Successfully updated port: fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquired lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.network.neutron [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-changed-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.compute.manager [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Refreshing instance network info cache due to event network-changed-fb82372c-8c1a-43e8-9eba-5f5469b8ac66. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] Acquiring lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG nova.network.neutron [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Instance cache missing network info. 
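All of the 'Acquiring lock ... / Acquired lock ... / Releasing lock ...' and 'Lock ... acquired by ... :: waited/held ...' lines in this trace come from oslo.concurrency's lockutils, used either as a plain context manager (the refresh_cache-<uuid> lock here) or as a decorator (compute_resources, vgpu_resources, the image-backend cache lock). A minimal sketch of both patterns; the function names and bodies are illustrative, not Nova's:

    # Minimal sketch of the two oslo.concurrency locking patterns seen in this log.
    from oslo_concurrency import lockutils

    @lockutils.synchronized("vgpu_resources")
    def allocate_mdevs_sketch():
        # Runs with the process-local "vgpu_resources" lock held; entry and exit are
        # what produce the 'acquired by' / '"released" by' lines with waited/held times.
        return []

    def refresh_cache_sketch(instance_uuid):
        # Context-manager form, matching the 'Acquiring/Acquired/Releasing lock
        # "refresh_cache-<uuid>"' lines around the network info cache rebuild.
        with lockutils.lock(f"refresh_cache-{instance_uuid}"):
            pass  # rebuild the instance network info cache here
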
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:57:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Successfully created port: 8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.neutron [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updating instance_info_cache with network_info: [{"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Releasing lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Instance network_info: |[{"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] Acquired lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.neutron [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Refreshing network info cache for port fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Start _get_guest_xml network_info=[{"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': 
'3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:57:13 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:57:13 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1050316627',display_name='tempest-ServerRescueNegativeTestJSON-server-1050316627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1050316627',id=20,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-i4m218s0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:57:11Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=476dbf2e-b02a-47bc-a8c6-6d0d66d5d433,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": 
"fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.objects.instance [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'pci_devices' on Instance uuid 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] End _get_guest_xml xml= Apr 21 10:57:13 user nova-compute[70954]: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 Apr 21 10:57:13 user nova-compute[70954]: instance-00000014 Apr 21 10:57:13 user nova-compute[70954]: 131072 Apr 21 10:57:13 user nova-compute[70954]: 1 Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: tempest-ServerRescueNegativeTestJSON-server-1050316627 Apr 21 10:57:13 user nova-compute[70954]: 2023-04-21 10:57:13 Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: 128 Apr 21 10:57:13 user nova-compute[70954]: 1 Apr 21 10:57:13 user nova-compute[70954]: 0 Apr 21 10:57:13 user nova-compute[70954]: 0 Apr 21 10:57:13 user nova-compute[70954]: 1 Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: tempest-ServerRescueNegativeTestJSON-1656706265-project-member Apr 21 10:57:13 user nova-compute[70954]: tempest-ServerRescueNegativeTestJSON-1656706265 Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 
user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: OpenStack Foundation Apr 21 10:57:13 user nova-compute[70954]: OpenStack Nova Apr 21 10:57:13 user nova-compute[70954]: 0.0.0 Apr 21 10:57:13 user nova-compute[70954]: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 Apr 21 10:57:13 user nova-compute[70954]: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 Apr 21 10:57:13 user nova-compute[70954]: Virtual Machine Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: hvm Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Nehalem Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: /dev/urandom Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: Apr 21 10:57:13 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1050316627',display_name='tempest-ServerRescueNegativeTestJSON-server-1050316627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1050316627',id=20,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-i4m218s0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:57:11Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=476dbf2e-b02a-47bc-a8c6-6d0d66d5d433,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": 
"fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG os_vif [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb82372c-8c, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb82372c-8c, col_values=(('external_ids', {'iface-id': 'fb82372c-8c1a-43e8-9eba-5f5469b8ac66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:18:9c', 'vm-uuid': '476dbf2e-b02a-47bc-a8c6-6d0d66d5d433'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:13 user nova-compute[70954]: INFO os_vif [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:57:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No VIF found with MAC fa:16:3e:d7:18:9c, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updated VIF entry in instance network info cache for port fb82372c-8c1a-43e8-9eba-5f5469b8ac66. 
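The ovsdbapp transactions above (AddBridgeCommand, AddPortCommand, and the DbSetCommand that stamps external_ids on the Interface row) are how os-vif's ovs plugin wires the tap device into br-int. Roughly the same effect can be produced from the CLI; the following is a hedged sketch of equivalent ovs-vsctl calls driven from Python, not the os-vif code path itself, with the port and VIF values taken from the log:

    # Sketch only: CLI equivalent of the ovsdbapp transaction logged above.
    # Assumes ovs-vsctl is installed and the caller has rights on the OVS database.
    import subprocess

    PORT = "tapfb82372c-8c"
    subprocess.check_call(["ovs-vsctl", "--may-exist", "add-br", "br-int"])
    subprocess.check_call([
        "ovs-vsctl", "--may-exist", "add-port", "br-int", PORT,
        "--", "set", "Interface", PORT,
        "external_ids:iface-id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66",
        "external_ids:iface-status=active",
        "external_ids:attached-mac=fa:16:3e:d7:18:9c",
        "external_ids:vm-uuid=476dbf2e-b02a-47bc-a8c6-6d0d66d5d433",
    ])
    # The "Transaction caused no change" line above simply means br-int already
    # existed, so the may_exist=True AddBridgeCommand was a no-op.
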
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updating instance_info_cache with network_info: [{"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a1d1edcf-51cd-4cc9-92e9-055ec99eea3f req-bbfc9c6b-aa5a-4923-abc5-2d427bd8b9ca service nova] Releasing lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Successfully updated port: 8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquired lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 
tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updating instance_info_cache with network_info: [{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Releasing lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance network_info: |[{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG 
nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Start _get_guest_xml network_info=[{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:57:14 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:57:14 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
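Editor's note: the ovsdbapp transaction logged earlier for port fb82372c-8c1a-43e8-9eba-5f5469b8ac66 (AddBridgeCommand, AddPortCommand, then DbSetCommand on the Interface row) amounts to "ensure br-int exists, ensure the tap port is attached, stamp the Neutron/Nova identifiers into external_ids". The sketch below is only meant to make that transaction legible by expressing it with the ovs-vsctl CLI from Python; os-vif performs the same steps natively over OVSDB, so nothing like this needs to be run by hand. The bridge, port name, MAC, and UUIDs are copied from the log above; assuming ovs-vsctl is installed and the script runs with root privileges.

```python
import subprocess

# Sketch of what the logged ovsdbapp transaction does, expressed via ovs-vsctl:
# ensure the bridge and port exist (idempotent, like may_exist=True above) and
# set the same external_ids keys the DbSetCommand wrote. Values come from the
# log entries above; this is illustrative, not Nova/os-vif code.
BRIDGE = "br-int"
PORT = "tapfb82372c-8c"
EXTERNAL_IDS = {
    "iface-id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66",
    "iface-status": "active",
    "attached-mac": "fa:16:3e:d7:18:9c",
    "vm-uuid": "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433",
}

subprocess.run(["ovs-vsctl", "--may-exist", "add-br", BRIDGE], check=True)
subprocess.run(
    ["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT, "--",
     "set", "Interface", PORT]
    # Quote each value so strings containing ':' (the MAC) parse cleanly.
    + [f'external_ids:{key}="{value}"' for key, value in EXTERNAL_IDS.items()],
    check=True,
)
```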
Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-6822012',display_name='tempest-ServerRescueNegativeTestJSON-server-6822012',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-6822012',id=21,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-cfw0zm0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:57:12Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=d1fca309-1d26-4a34-b932-716064b86b00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": 
"fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.objects.instance [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'pci_devices' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] End _get_guest_xml xml= Apr 21 10:57:14 user nova-compute[70954]: d1fca309-1d26-4a34-b932-716064b86b00 Apr 21 10:57:14 user nova-compute[70954]: instance-00000015 Apr 21 10:57:14 user nova-compute[70954]: 131072 Apr 21 10:57:14 user nova-compute[70954]: 1 Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: tempest-ServerRescueNegativeTestJSON-server-6822012 Apr 21 10:57:14 user nova-compute[70954]: 2023-04-21 10:57:14 Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: 128 Apr 21 10:57:14 user nova-compute[70954]: 1 Apr 21 10:57:14 user nova-compute[70954]: 0 Apr 21 10:57:14 user nova-compute[70954]: 0 Apr 21 10:57:14 user nova-compute[70954]: 1 Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: tempest-ServerRescueNegativeTestJSON-1656706265-project-member Apr 21 10:57:14 user nova-compute[70954]: tempest-ServerRescueNegativeTestJSON-1656706265 Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user 
nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: OpenStack Foundation Apr 21 10:57:14 user nova-compute[70954]: OpenStack Nova Apr 21 10:57:14 user nova-compute[70954]: 0.0.0 Apr 21 10:57:14 user nova-compute[70954]: d1fca309-1d26-4a34-b932-716064b86b00 Apr 21 10:57:14 user nova-compute[70954]: d1fca309-1d26-4a34-b932-716064b86b00 Apr 21 10:57:14 user nova-compute[70954]: Virtual Machine Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: hvm Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Nehalem Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: /dev/urandom Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: Apr 21 10:57:14 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-6822012',display_name='tempest-ServerRescueNegativeTestJSON-server-6822012',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-6822012',id=21,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-cfw0zm0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:57:12Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=d1fca309-1d26-4a34-b932-716064b86b00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", 
"network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG os_vif [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8abc9260-fa, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8abc9260-fa, col_values=(('external_ids', {'iface-id': '8abc9260-fa02-4915-a056-63262b57e3be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8d:97:64', 'vm-uuid': 'd1fca309-1d26-4a34-b932-716064b86b00'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: INFO os_vif [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No VIF found with MAC fa:16:3e:8d:97:64, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.compute.manager [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-changed-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.compute.manager [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Refreshing instance network info cache due to event network-changed-8abc9260-fa02-4915-a056-63262b57e3be. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] Acquiring lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] Acquired lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:57:14 user nova-compute[70954]: DEBUG nova.network.neutron [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Refreshing network info cache for port 8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG nova.network.neutron [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 
service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updated VIF entry in instance network info cache for port 8abc9260-fa02-4915-a056-63262b57e3be. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG nova.network.neutron [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updating instance_info_cache with network_info: [{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e0e04265-ea6e-4d1b-ac6f-5be79f2782ab req-9952e3b5-30a0-4e95-b1ff-8fdad6bd2064 service nova] Releasing lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG nova.compute.manager [req-ea573502-067d-43d8-ba9a-205bd666c889 req-1324fa70-ae94-448e-bb7f-6c269b2d4c54 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ea573502-067d-43d8-ba9a-205bd666c889 req-1324fa70-ae94-448e-bb7f-6c269b2d4c54 service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ea573502-067d-43d8-ba9a-205bd666c889 req-1324fa70-ae94-448e-bb7f-6c269b2d4c54 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ea573502-067d-43d8-ba9a-205bd666c889 req-1324fa70-ae94-448e-bb7f-6c269b2d4c54 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG nova.compute.manager [req-ea573502-067d-43d8-ba9a-205bd666c889 req-1324fa70-ae94-448e-bb7f-6c269b2d4c54 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:57:16 user nova-compute[70954]: WARNING nova.compute.manager [req-ea573502-067d-43d8-ba9a-205bd666c889 req-1324fa70-ae94-448e-bb7f-6c269b2d4c54 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state building and task_state spawning. Apr 21 10:57:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] No waiting 
events found dispatching network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:57:17 user nova-compute[70954]: WARNING nova.compute.manager [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received unexpected event network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 for instance with vm_state building and task_state spawning. Apr 21 10:57:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] No waiting events found dispatching network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:57:17 user nova-compute[70954]: WARNING nova.compute.manager [req-d0d8a8ad-a6f5-42f2-9e80-f62ebf30e192 req-2601674b-3b94-404c-b2cb-941f991bdc16 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received unexpected event network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 for instance with vm_state building and task_state spawning. 
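Editor's note: the repeated "Received unexpected event network-vif-plugged-…" WARNINGs above are benign in this trace (both instances are still in vm_state building / task_state spawning), but when triaging a failed run it helps to tally them per instance and port. A small, hypothetical stdlib-only helper for doing that over a captured journal; the function name and regex are mine, not part of Nova.

```python
import re

# Hypothetical triage helper: pull (instance uuid, port uuid) pairs out of the
# "Received unexpected event network-vif-plugged-<port>" WARNING lines so
# repeats can be counted per port across a journal capture.
PATTERN = re.compile(
    r"\[instance: (?P<instance>[0-9a-f-]{36})\] "
    r"Received unexpected event network-vif-plugged-(?P<port>[0-9a-f-]{36})"
)

def unexpected_plug_events(log_text: str) -> list[tuple[str, str]]:
    return [(m.group("instance"), m.group("port"))
            for m in PATTERN.finditer(log_text)]
```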
Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] VM Resumed (Lifecycle Event) Apr 21 10:57:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Instance spawned successfully. Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Found default for hw_disk_bus of virtio 
{{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] VM Started (Lifecycle Event) Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] During sync_power_state the instance has a pending task (spawning). Skip. 
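Editor's note: the "Found default for hw_cdrom_bus of ide / hw_disk_bus of virtio / hw_video_model of virtio / hw_vif_model of virtio" lines show the driver registering defaults because the cirros image carries no explicit hw_* properties. One way to pin those same values up front is to set them on the Glance image; the sketch below does that with the openstack CLI from Python, assuming python-openstackclient is installed and credentials are sourced. The image ID is the cirros image used throughout this trace; the choice to pin exactly these four keys is mine, mirroring the logged defaults.

```python
import subprocess

# Sketch only: set the same defaults the driver registered above directly on
# the Glance image, so later boots of this image state them explicitly.
IMAGE = "3b29a01a-1fc0-4d0d-89fb-23d22b2de02e"
PROPS = {
    "hw_cdrom_bus": "ide",
    "hw_disk_bus": "virtio",
    "hw_video_model": "virtio",
    "hw_vif_model": "virtio",
}

cmd = ["openstack", "image", "set"]
for key, value in PROPS.items():
    cmd += ["--property", f"{key}={value}"]
subprocess.run(cmd + [IMAGE], check=True)
```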
Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Took 7.41 seconds to spawn the instance on the hypervisor. Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Took 7.96 seconds to build instance. Apr 21 10:57:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-7cdc0be1-c923-476e-b012-e3f6cd0b8d95 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.057s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] VM Resumed (Lifecycle Event) Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance spawned successfully. 
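Editor's note: the "Synchronizing instance power state after lifecycle event … / During sync_power_state the instance has a pending task (spawning). Skip." pairs above reflect a simple guard: a lifecycle-driven power-state sync is not applied while the instance still has a pending task. The snippet below is an illustrative reduction of that observed behaviour, not a copy of Nova's implementation.

```python
from typing import Optional

# Illustrative reduction of the guard visible in the log: lifecycle events
# ("Resumed", "Started") arriving while task_state is still 'spawning' only
# produce the "Skip." INFO lines; the sync proceeds once no task is pending.
def should_sync_power_state(task_state: Optional[str]) -> bool:
    return task_state is None

assert should_sync_power_state("spawning") is False
assert should_sync_power_state(None) is True
```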
Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] VM Started (Lifecycle Event) Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Took 6.84 seconds to spawn the instance on the hypervisor. 
Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [req-51b6dd0b-b340-46da-8156-1489881872a7 req-9827126e-b236-4968-887f-a7adc17d5625 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-51b6dd0b-b340-46da-8156-1489881872a7 req-9827126e-b236-4968-887f-a7adc17d5625 service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-51b6dd0b-b340-46da-8156-1489881872a7 req-9827126e-b236-4968-887f-a7adc17d5625 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-51b6dd0b-b340-46da-8156-1489881872a7 req-9827126e-b236-4968-887f-a7adc17d5625 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:18 user nova-compute[70954]: DEBUG nova.compute.manager [req-51b6dd0b-b340-46da-8156-1489881872a7 req-9827126e-b236-4968-887f-a7adc17d5625 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:57:18 user nova-compute[70954]: WARNING nova.compute.manager [req-51b6dd0b-b340-46da-8156-1489881872a7 req-9827126e-b236-4968-887f-a7adc17d5625 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state building and task_state spawning. Apr 21 10:57:18 user nova-compute[70954]: INFO nova.compute.manager [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Took 7.52 seconds to build instance. 
Apr 21 10:57:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-f674a2a2-76ef-4a7e-87c7-6690f36e17b2 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "d1fca309-1d26-4a34-b932-716064b86b00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.658s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:19 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:57:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-changed-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Refreshing instance network info cache due to event network-changed-5751ca80-5041-4999-b832-b427fb0af8a2. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] Acquiring lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] Acquired lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG nova.network.neutron [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Refreshing network info cache for port 5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG nova.network.neutron [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Updated VIF entry in instance network info cache for port 5751ca80-5041-4999-b832-b427fb0af8a2. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:57:43 user nova-compute[70954]: DEBUG nova.network.neutron [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Updating instance_info_cache with network_info: [{"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-226980fa-d787-49f6-84e8-cfb64cc6030d req-9f7d12ee-6219-4974-ba1a-6fd05dbb95b4 service nova] Releasing lock "refresh_cache-c70df604-601e-4451-828d-20c649b6052a" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils 
[None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "c70df604-601e-4451-828d-20c649b6052a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "c70df604-601e-4451-828d-20c649b6052a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "c70df604-601e-4451-828d-20c649b6052a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:44 user nova-compute[70954]: INFO nova.compute.manager [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Terminating instance Apr 21 10:57:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-71370ddf-2cc0-462a-90cb-44bba6f9b095 req-d61f3e69-6b45-4315-ae7e-6cb685354721 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-vif-unplugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-71370ddf-2cc0-462a-90cb-44bba6f9b095 req-d61f3e69-6b45-4315-ae7e-6cb685354721 service nova] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-71370ddf-2cc0-462a-90cb-44bba6f9b095 req-d61f3e69-6b45-4315-ae7e-6cb685354721 service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-71370ddf-2cc0-462a-90cb-44bba6f9b095 req-d61f3e69-6b45-4315-ae7e-6cb685354721 service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-71370ddf-2cc0-462a-90cb-44bba6f9b095 req-d61f3e69-6b45-4315-ae7e-6cb685354721 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] No waiting events found dispatching network-vif-unplugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-71370ddf-2cc0-462a-90cb-44bba6f9b095 req-d61f3e69-6b45-4315-ae7e-6cb685354721 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-vif-unplugged-5751ca80-5041-4999-b832-b427fb0af8a2 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 10:57:45 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: c70df604-601e-4451-828d-20c649b6052a] Instance destroyed successfully. 
Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.objects.instance [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'resources' on Instance uuid c70df604-601e-4451-828d-20c649b6052a {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:55:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1645771321',display_name='tempest-AttachVolumeNegativeTest-server-1645771321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1645771321',id=18,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHp6Xpoy4by46AxWCAbGiYDwkQpafANsc/yEMfhdZjZu+ouzXhRTZE2gofYmbc0DufFTp50aDb84APASGieWAMislJ/20uHZwVFUyEEqhMGv/VNdPqJyM1DgvCFkYCmKMQ==',key_name='tempest-keypair-610711731',keypairs=<?>,launch_index=0,launched_at=2023-04-21T10:55:57Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-7t5z091m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-21T10:55:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=c70df604-601e-4451-828d-20c649b6052a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "5751ca80-5041-4999-b832-b427fb0af8a2", "address": "fa:16:3e:5f:74:cd", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5751ca80-50", "ovs_interfaceid": "5751ca80-5041-4999-b832-b427fb0af8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG os_vif [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5751ca80-50, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:57:45 user nova-compute[70954]: INFO os_vif [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:74:cd,bridge_name='br-int',has_traffic_filtering=True,id=5751ca80-5041-4999-b832-b427fb0af8a2,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5751ca80-50') Apr 21 10:57:45 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Deleting instance files /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a_del Apr 21 10:57:45 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Deletion of /opt/stack/data/nova/instances/c70df604-601e-4451-828d-20c649b6052a_del complete Apr 21 10:57:45 user nova-compute[70954]: INFO nova.compute.manager [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: c70df604-601e-4451-828d-20c649b6052a] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 21 10:57:45 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: c70df604-601e-4451-828d-20c649b6052a] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 10:57:45 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: c70df604-601e-4451-828d-20c649b6052a] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: c70df604-601e-4451-828d-20c649b6052a] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:57:46 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: c70df604-601e-4451-828d-20c649b6052a] Took 0.66 seconds to deallocate network for instance. Apr 21 10:57:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-1ccf935a-9a5e-4b90-a9d7-0144934648ae req-b1ed3832-30c0-4115-914d-1184d4bf4d51 service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-vif-deleted-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:57:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:46 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Deleted allocations for instance c70df604-601e-4451-828d-20c649b6052a Apr 21 10:57:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-6b6e9085-05a2-4e48-9507-4cdca1d4b200 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "c70df604-601e-4451-828d-20c649b6052a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.700s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:47 user nova-compute[70954]: DEBUG nova.compute.manager [req-a2db9593-d669-4c0a-8a41-f4e65845b4d1 req-207ba456-9b57-4b73-bd20-26078c87d73c service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received event network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:57:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a2db9593-d669-4c0a-8a41-f4e65845b4d1 req-207ba456-9b57-4b73-bd20-26078c87d73c service nova] Acquiring lock "c70df604-601e-4451-828d-20c649b6052a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:57:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a2db9593-d669-4c0a-8a41-f4e65845b4d1 req-207ba456-9b57-4b73-bd20-26078c87d73c service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:57:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a2db9593-d669-4c0a-8a41-f4e65845b4d1 req-207ba456-9b57-4b73-bd20-26078c87d73c service nova] Lock "c70df604-601e-4451-828d-20c649b6052a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:57:47 user nova-compute[70954]: DEBUG nova.compute.manager [req-a2db9593-d669-4c0a-8a41-f4e65845b4d1 req-207ba456-9b57-4b73-bd20-26078c87d73c service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] No waiting events found dispatching network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:57:47 user nova-compute[70954]: WARNING nova.compute.manager [req-a2db9593-d669-4c0a-8a41-f4e65845b4d1 req-207ba456-9b57-4b73-bd20-26078c87d73c service nova] [instance: c70df604-601e-4451-828d-20c649b6052a] Received unexpected event network-vif-plugged-5751ca80-5041-4999-b832-b427fb0af8a2 
for instance with vm_state deleted and task_state None. Apr 21 10:57:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:50 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:57:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:00 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:58:00 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: c70df604-601e-4451-828d-20c649b6052a] VM Stopped (Lifecycle Event) Apr 21 10:58:00 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad0b10b1-bf9a-44a7-a056-0918cc5e7a8f None None] [instance: c70df604-601e-4451-828d-20c649b6052a] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:58:04 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:58:05 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updating instance_info_cache with network_info: [{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:58:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:58:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:58:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances with incomplete migration {{(pid=70954) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 10:58:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:58:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:10 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:58:10 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8887MB free_disk=26.521549224853516GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance d1fca309-1d26-4a34-b932-716064b86b00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:58:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:58:17 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 10:58:17 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] There are 0 instances to clean {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 10:58:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "32a0063e-076e-4585-981e-fe853499aee3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:26 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:58:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:26 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:26 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:58:26 user nova-compute[70954]: INFO nova.compute.claims [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Claim successful on node user Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.network.neutron [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:58:27 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.policy [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95956d2e4ea84534b6d5628eb8dd184d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc12d15daf34c5e9d26a6cc53795efe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:58:27 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Creating image(s) Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "/opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "/opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "/opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk 1073741824" returned: 0 in 0.049s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.199s 
{{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Checking if we can resize image /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.network.neutron [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Successfully created port: 3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Cannot resize image /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.objects.instance [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lazy-loading 'migration_context' on Instance uuid 32a0063e-076e-4585-981e-fe853499aee3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Ensure instance console log exists: /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:27 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG nova.network.neutron [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Successfully updated port: 3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "refresh_cache-32a0063e-076e-4585-981e-fe853499aee3" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquired lock "refresh_cache-32a0063e-076e-4585-981e-fe853499aee3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG nova.network.neutron [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG nova.compute.manager [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-changed-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG nova.compute.manager [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Refreshing instance network info cache due to event network-changed-3e412892-1f09-4bcc-8126-bf7d69b8b2d2. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] Acquiring lock "refresh_cache-32a0063e-076e-4585-981e-fe853499aee3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:58:28 user nova-compute[70954]: DEBUG nova.network.neutron [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.neutron [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Updating instance_info_cache with network_info: [{"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Releasing lock "refresh_cache-32a0063e-076e-4585-981e-fe853499aee3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Instance network_info: |[{"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] Acquired lock "refresh_cache-32a0063e-076e-4585-981e-fe853499aee3" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.neutron [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Refreshing network info cache for port 3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Start _get_guest_xml network_info=[{"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:58:29 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:58:29 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:58:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1462598035',display_name='tempest-ServersNegativeTestJSON-server-1462598035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1462598035',id=22,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc12d15daf34c5e9d26a6cc53795efe',ramdisk_id='',reservation_id='r-zina7ghx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1265836442',owner_user_name='tempest-ServersNegativeTes
tJSON-1265836442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:58:27Z,user_data=None,user_id='95956d2e4ea84534b6d5628eb8dd184d',uuid=32a0063e-076e-4585-981e-fe853499aee3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converting VIF {"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.objects.instance [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] 
Lazy-loading 'pci_devices' on Instance uuid 32a0063e-076e-4585-981e-fe853499aee3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] End _get_guest_xml xml= [guest domain XML not recoverable from this capture: the element tags were stripped; surviving values include domain name instance-00000016, uuid 32a0063e-076e-4585-981e-fe853499aee3, memory 131072 KiB, 1 vCPU, os type hvm, CPU model Nehalem, RNG backend /dev/urandom, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, and Nova metadata for tempest-ServersNegativeTestJSON-server-1462598035 created 2023-04-21 10:58:29 under project tempest-ServersNegativeTestJSON-1265836442]
{{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:58:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1462598035',display_name='tempest-ServersNegativeTestJSON-server-1462598035',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1462598035',id=22,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc12d15daf34c5e9d26a6cc53795efe',ramdisk_id='',reservation_id='r-zina7ghx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1265836442',owner_user_name='tempest-ServersNegativeTestJSON-1265836442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:58:27Z,user_data=None,user_id='95956d2e4ea84534b6d5628eb8dd184d',uuid=32a0063e-076e-4585-981e-fe853499aee3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [],
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converting VIF {"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG os_vif [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:29 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e412892-1f, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e412892-1f, col_values=(('external_ids', {'iface-id': '3e412892-1f09-4bcc-8126-bf7d69b8b2d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:64:57', 'vm-uuid': '32a0063e-076e-4585-981e-fe853499aee3'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:29 user nova-compute[70954]: INFO os_vif [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] No VIF found with MAC fa:16:3e:67:64:57, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.neutron [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Updated VIF entry in instance network info cache for port 3e412892-1f09-4bcc-8126-bf7d69b8b2d2. {{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG nova.network.neutron [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Updating instance_info_cache with network_info: [{"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:58:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d795d514-3df8-490b-be28-fc4953178de6 req-b1519a56-7814-4d39-b176-cd279bfd43ce service nova] Releasing lock "refresh_cache-32a0063e-076e-4585-981e-fe853499aee3" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:58:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:31 user nova-compute[70954]: DEBUG 
nova.compute.manager [req-23d06e58-bc79-4b2f-8542-c5dc4deda75e req-f544c38f-d7e5-4567-90cf-6be29af8cb7b service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:58:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-23d06e58-bc79-4b2f-8542-c5dc4deda75e req-f544c38f-d7e5-4567-90cf-6be29af8cb7b service nova] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-23d06e58-bc79-4b2f-8542-c5dc4deda75e req-f544c38f-d7e5-4567-90cf-6be29af8cb7b service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-23d06e58-bc79-4b2f-8542-c5dc4deda75e req-f544c38f-d7e5-4567-90cf-6be29af8cb7b service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-23d06e58-bc79-4b2f-8542-c5dc4deda75e req-f544c38f-d7e5-4567-90cf-6be29af8cb7b service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] No waiting events found dispatching network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:58:31 user nova-compute[70954]: WARNING nova.compute.manager [req-23d06e58-bc79-4b2f-8542-c5dc4deda75e req-f544c38f-d7e5-4567-90cf-6be29af8cb7b service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received unexpected event network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 for instance with vm_state building and task_state spawning. 
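[Editor's sketch] The OVSDB activity above (AddBridgeCommand, AddPortCommand, DbSetCommand against tcp:127.0.0.1:6640) is issued by os-vif through ovsdbapp when it plugs the tap device into br-int. The following minimal Python sketch replays an equivalent transaction; the connection setup and method names are an assumption based on the public ovsdbapp API rather than a copy of the os-vif plugin code, and the bridge, port name, and external_ids values are taken directly from the log lines.

# Sketch only: replay the logged OVSDB commands with ovsdbapp.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Endpoint taken from the reconnect/inactivity-probe lines in this log.
idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
ovs = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

external_ids = {
    'iface-id': '3e412892-1f09-4bcc-8126-bf7d69b8b2d2',
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:67:64:57',
    'vm-uuid': '32a0063e-076e-4585-981e-fe853499aee3',
}

# One transaction combining the commands logged as "Running txn n=1 command(idx=0/1)".
with ovs.transaction(check_error=True) as txn:
    txn.add(ovs.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(ovs.add_port('br-int', 'tap3e412892-1f', may_exist=True))
    txn.add(ovs.db_set('Interface', 'tap3e412892-1f',
                       ('external_ids', external_ids)))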
Apr 21 10:58:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:58:32 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] VM Resumed (Lifecycle Event) Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:58:32 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Instance spawned successfully. Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:32 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:58:32 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] VM Started (Lifecycle Event) Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:58:32 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] During sync_power_state the instance has a pending task (spawning). Skip. 
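[Editor's sketch] The sync_power_state lines above report power states as raw integers ("current DB power_state: 0, VM power_state: 1"). The small helper below is purely illustrative, not part of Nova; the mapping follows the constants in nova.compute.power_state (0 = NOSTATE, 1 = RUNNING), and the function name is an invented convenience for reading these logs.

# Illustrative decoder for the integer power states in the log lines above.
POWER_STATES = {
    0: 'NOSTATE',    # nothing recorded yet, e.g. instance still building
    1: 'RUNNING',    # hypervisor reports the guest as running
    3: 'PAUSED',
    4: 'SHUTDOWN',
    6: 'CRASHED',
    7: 'SUSPENDED',
}

def describe_power_state(db_state: int, vm_state: int) -> str:
    return (f"DB={POWER_STATES.get(db_state, db_state)} "
            f"hypervisor={POWER_STATES.get(vm_state, vm_state)}")

# "current DB power_state: 0, VM power_state: 1" from the lines above:
print(describe_power_state(0, 1))  # DB=NOSTATE hypervisor=RUNNING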
Apr 21 10:58:32 user nova-compute[70954]: INFO nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Took 5.35 seconds to spawn the instance on the hypervisor. Apr 21 10:58:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:32 user nova-compute[70954]: INFO nova.compute.manager [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Took 5.92 seconds to build instance. Apr 21 10:58:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b560d37a-d6cc-4113-b53a-2db250bcd42d tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "32a0063e-076e-4585-981e-fe853499aee3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.018s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-ee468e2e-4741-4e78-b850-ba6865e3b88a req-0ea32abc-1c77-4fb3-b017-910307a902e5 service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:58:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ee468e2e-4741-4e78-b850-ba6865e3b88a req-0ea32abc-1c77-4fb3-b017-910307a902e5 service nova] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ee468e2e-4741-4e78-b850-ba6865e3b88a req-0ea32abc-1c77-4fb3-b017-910307a902e5 service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-ee468e2e-4741-4e78-b850-ba6865e3b88a req-0ea32abc-1c77-4fb3-b017-910307a902e5 service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-ee468e2e-4741-4e78-b850-ba6865e3b88a req-0ea32abc-1c77-4fb3-b017-910307a902e5 service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] No waiting events found dispatching 
network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:58:33 user nova-compute[70954]: WARNING nova.compute.manager [req-ee468e2e-4741-4e78-b850-ba6865e3b88a req-0ea32abc-1c77-4fb3-b017-910307a902e5 service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received unexpected event network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 for instance with vm_state active and task_state None. Apr 21 10:58:34 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "332727a2-b516-40e2-9db1-460563e6ebd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 10:58:39 user nova-compute[70954]: INFO nova.compute.claims [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Claim successful on node user Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Start building networks asynchronously for instance. 
{{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 10:58:39 user nova-compute[70954]: DEBUG nova.network.neutron [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 10:58:40 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 10:58:40 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG nova.policy [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd7fc66871488428e9842404d885bcfe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '14bc6b0c20204c8287b3523814007856', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Start spawning the instance on the hypervisor. 
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 10:58:40 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Creating image(s) Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "/opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "/opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "/opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.202s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.141s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk 1073741824" returned: 0 in 0.050s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.201s 
{{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.162s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Checking if we can resize image /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:58:40 user nova-compute[70954]: DEBUG nova.network.neutron [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Successfully created port: 7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Cannot resize image /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.objects.instance [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'migration_context' on Instance uuid 332727a2-b516-40e2-9db1-460563e6ebd1 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Ensure instance console log exists: /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.network.neutron [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Successfully updated port: 7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 
tempest-AttachVolumeNegativeTest-159654333-project-member] Acquired lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.network.neutron [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-changed-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Refreshing instance network info cache due to event network-changed-7d9b3006-2edc-475f-8387-86fc52c807f0. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] Acquiring lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:58:41 user nova-compute[70954]: DEBUG nova.network.neutron [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Instance cache missing network info. 
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.neutron [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Updating instance_info_cache with network_info: [{"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Releasing lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Instance network_info: |[{"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] Acquired lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.neutron [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Refreshing network info cache for port 7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Start _get_guest_xml network_info=[{"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:58:42 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:58:42 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:58:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-117564941',display_name='tempest-AttachVolumeNegativeTest-server-117564941',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-117564941',id=23,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO8Apx3ELF1FQJTJmhq4XG7YL/DAo8z97ik4gGkKvut8Sus+PkfBnxQbqnCnfruRlDoOqDGvT630ViBsjv9qzGHLWN6zYB2m0AN9jyGLG1T5nwr0xbvXMEYFBE9kKO5cg==',key_name='tempest-keypair-2123760870',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-fed6e10h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:58:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=332727a2-b516-40e2-9db1-460563e6ebd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.objects.instance [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'pci_devices' on Instance uuid 332727a2-b516-40e2-9db1-460563e6ebd1 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] End _get_guest_xml xml= Apr 21 10:58:42 user nova-compute[70954]: 332727a2-b516-40e2-9db1-460563e6ebd1 Apr 21 10:58:42 user nova-compute[70954]: instance-00000017 Apr 21 10:58:42 user nova-compute[70954]: 131072 Apr 21 10:58:42 user nova-compute[70954]: 1 Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-server-117564941 Apr 21 10:58:42 user nova-compute[70954]: 2023-04-21 10:58:42 Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: 128 Apr 21 10:58:42 user nova-compute[70954]: 1 Apr 21 10:58:42 user nova-compute[70954]: 0 Apr 21 10:58:42 user nova-compute[70954]: 0 Apr 21 10:58:42 user nova-compute[70954]: 1 Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-159654333-project-member Apr 21 10:58:42 user nova-compute[70954]: tempest-AttachVolumeNegativeTest-159654333 Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: OpenStack Foundation Apr 21 10:58:42 user nova-compute[70954]: OpenStack Nova Apr 21 10:58:42 user nova-compute[70954]: 0.0.0 Apr 21 10:58:42 user 
nova-compute[70954]: 332727a2-b516-40e2-9db1-460563e6ebd1 Apr 21 10:58:42 user nova-compute[70954]: 332727a2-b516-40e2-9db1-460563e6ebd1 Apr 21 10:58:42 user nova-compute[70954]: Virtual Machine Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: hvm Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Nehalem Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: /dev/urandom Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: Apr 21 10:58:42 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:58:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-117564941',display_name='tempest-AttachVolumeNegativeTest-server-117564941',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-117564941',id=23,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO8Apx3ELF1FQJTJmhq4XG7YL/DAo8z97ik4gGkKvut8Sus+PkfBnxQbqnCnfruRlDoOqDGvT630ViBsjv9qzGHLWN6zYB2m0AN9jyGLG1T5nwr0xbvXMEYFBE9kKO5cg==',key_name='tempest-keypair-2123760870',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-fed6e10h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:58:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=332727a2-b516-40e2-9db1-460563e6ebd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG os_vif [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d9b3006-2e, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7d9b3006-2e, col_values=(('external_ids', {'iface-id': '7d9b3006-2edc-475f-8387-86fc52c807f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:1f:26', 'vm-uuid': '332727a2-b516-40e2-9db1-460563e6ebd1'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:42 user nova-compute[70954]: INFO os_vif [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] No VIF found with MAC fa:16:3e:77:1f:26, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.neutron [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Updated VIF entry in instance network info cache for port 7d9b3006-2edc-475f-8387-86fc52c807f0. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG nova.network.neutron [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Updating instance_info_cache with network_info: [{"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:58:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d8f97fd1-c0b4-41b6-97a0-4aee05d30310 req-cde82b48-6515-464b-9293-59ec66840044 service nova] Releasing lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-03378451-870f-4180-82d9-adb2f4c87dd8 req-6d3ba26a-2407-47b6-9834-18a8ea1de05d service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-03378451-870f-4180-82d9-adb2f4c87dd8 req-6d3ba26a-2407-47b6-9834-18a8ea1de05d service nova] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-03378451-870f-4180-82d9-adb2f4c87dd8 req-6d3ba26a-2407-47b6-9834-18a8ea1de05d service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-03378451-870f-4180-82d9-adb2f4c87dd8 req-6d3ba26a-2407-47b6-9834-18a8ea1de05d service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-03378451-870f-4180-82d9-adb2f4c87dd8 req-6d3ba26a-2407-47b6-9834-18a8ea1de05d service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] No waiting events found dispatching network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:58:43 user nova-compute[70954]: WARNING nova.compute.manager [req-03378451-870f-4180-82d9-adb2f4c87dd8 req-6d3ba26a-2407-47b6-9834-18a8ea1de05d service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received unexpected event network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 for instance with vm_state building and task_state spawning. Apr 21 10:58:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:44 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:58:45 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] VM Resumed (Lifecycle Event) Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 10:58:45 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] 
[instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Instance spawned successfully. Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 
tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 10:58:45 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:58:45 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] VM Started (Lifecycle Event) Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:58:45 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-70de06cb-5aa3-4dad-8aa7-77c89c567042 req-a12ce27d-2e48-4f2e-a13e-db86828c8971 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-70de06cb-5aa3-4dad-8aa7-77c89c567042 req-a12ce27d-2e48-4f2e-a13e-db86828c8971 service nova] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-70de06cb-5aa3-4dad-8aa7-77c89c567042 req-a12ce27d-2e48-4f2e-a13e-db86828c8971 service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-70de06cb-5aa3-4dad-8aa7-77c89c567042 req-a12ce27d-2e48-4f2e-a13e-db86828c8971 service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [req-70de06cb-5aa3-4dad-8aa7-77c89c567042 req-a12ce27d-2e48-4f2e-a13e-db86828c8971 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] No waiting events found dispatching network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:58:45 user nova-compute[70954]: WARNING nova.compute.manager [req-70de06cb-5aa3-4dad-8aa7-77c89c567042 req-a12ce27d-2e48-4f2e-a13e-db86828c8971 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received unexpected event network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 for instance with vm_state building and task_state spawning. Apr 21 10:58:45 user nova-compute[70954]: INFO nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Took 5.55 seconds to spawn the instance on the hypervisor. Apr 21 10:58:45 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:58:45 user nova-compute[70954]: INFO nova.compute.manager [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Took 6.41 seconds to build instance. 
Apr 21 10:58:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e2892ad4-7d81-49cf-a3f6-51b8793953f9 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "332727a2-b516-40e2-9db1-460563e6ebd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.504s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:58:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:58:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:01 user nova-compute[70954]: INFO nova.compute.manager [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Rescuing Apr 21 10:59:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:59:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquired lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:59:01 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 10:59:01 user nova-compute[70954]: DEBUG nova.network.neutron [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updating instance_info_cache with network_info: [{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", 
"bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:59:01 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Releasing lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:59:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-9dd6c270-b87f-4dee-9ae9-cdac22e93f0b req-d88ceb91-05bf-400f-b30d-4ebf697683c8 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-unplugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9dd6c270-b87f-4dee-9ae9-cdac22e93f0b req-d88ceb91-05bf-400f-b30d-4ebf697683c8 service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9dd6c270-b87f-4dee-9ae9-cdac22e93f0b req-d88ceb91-05bf-400f-b30d-4ebf697683c8 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-9dd6c270-b87f-4dee-9ae9-cdac22e93f0b req-d88ceb91-05bf-400f-b30d-4ebf697683c8 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.compute.manager [req-9dd6c270-b87f-4dee-9ae9-cdac22e93f0b req-d88ceb91-05bf-400f-b30d-4ebf697683c8 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-unplugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 
10:59:02 user nova-compute[70954]: WARNING nova.compute.manager [req-9dd6c270-b87f-4dee-9ae9-cdac22e93f0b req-d88ceb91-05bf-400f-b30d-4ebf697683c8 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-unplugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state active and task_state rescuing. Apr 21 10:59:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance destroyed successfully. Apr 21 10:59:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Attempting rescue Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=70954) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance directory exists: not creating {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 21 10:59:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Creating image(s) Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.info" acquired by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'trusted_certs' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue" returned: 0 in 0.054s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.199s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'migration_context' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Start _get_guest_xml network_info=[{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "vif_mac": "fa:16:3e:8d:97:64"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue={'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'resources' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'numa_topology' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:59:02 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:59:02 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'vcpu_model' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-6822012',display_name='tempest-ServerRescueNegativeTestJSON-server-6822012',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-6822012',id=21,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:57:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-cfw0zm0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T10:57:19Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=d1fca309-1d26-4a34-b932-716064b86b00,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "vif_mac": "fa:16:3e:8d:97:64"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None 
req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "vif_mac": "fa:16:3e:8d:97:64"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.objects.instance [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'pci_devices' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] End _get_guest_xml xml=
[guest domain XML logged here, one XML line per syslog record; the element markup was stripped when this log was captured. Recoverable text values: uuid d1fca309-1d26-4a34-b932-716064b86b00, domain name instance-00000015, memory 131072 KiB, 1 vCPU, nova display name tempest-ServerRescueNegativeTestJSON-server-6822012, creation time 2023-04-21 10:59:02, owner tempest-ServerRescueNegativeTestJSON-1656706265-project-member in project tempest-ServerRescueNegativeTestJSON-1656706265, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] Apr 21 10:59:02 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 10:59:02 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance destroyed successfully.
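Annotation: in this rescue flow the new guest XML boots from disk.rescue (boot_index '1' on vda) while the instance's original disk remains attached as vdb, and the running domain is destroyed before being redefined around that XML, which is why "Instance destroyed successfully" appears mid-rescue. A small sketch of reading the boot order out of the disk mapping logged in Start _get_guest_xml (the mapping literal is copied from the log; the selection logic is illustrative only):

    # Disk mapping exactly as logged by Start _get_guest_xml for this rescue build.
    mapping = {
        'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'},
        'root':        {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'},
        'disk':        {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'},
    }

    # Entries carrying a boot_index are the ones the guest tries to boot from;
    # the original 'disk' on vdb has none, so it is attached as plain storage.
    bootable = sorted((v['boot_index'], name, v['dev'])
                      for name, v in mapping.items() if 'boot_index' in v)
    print(bootable)   # [('1', 'disk.rescue', 'vda'), ('1', 'root', 'vda')]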
Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No BDM found with device name vdb, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 10:59:02 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] No VIF found with MAC fa:16:3e:8d:97:64, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 10:59:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG nova.compute.manager [req-b555fade-9d8a-45df-b707-b6db6c4a6499 req-f4dd6db5-3b10-4baf-a11a-4c01c55333d2 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b555fade-9d8a-45df-b707-b6db6c4a6499 req-f4dd6db5-3b10-4baf-a11a-4c01c55333d2 service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b555fade-9d8a-45df-b707-b6db6c4a6499 req-f4dd6db5-3b10-4baf-a11a-4c01c55333d2 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b555fade-9d8a-45df-b707-b6db6c4a6499 req-f4dd6db5-3b10-4baf-a11a-4c01c55333d2 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG nova.compute.manager [req-b555fade-9d8a-45df-b707-b6db6c4a6499 req-f4dd6db5-3b10-4baf-a11a-4c01c55333d2 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:59:04 user nova-compute[70954]: 
WARNING nova.compute.manager [req-b555fade-9d8a-45df-b707-b6db6c4a6499 req-f4dd6db5-3b10-4baf-a11a-4c01c55333d2 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state active and task_state rescuing. Apr 21 10:59:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:05 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:05 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 10:59:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 10:59:05 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 10:59:05 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] Lock 
"d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:59:06 user nova-compute[70954]: WARNING nova.compute.manager [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state active and task_state rescuing. Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 10:59:06 user nova-compute[70954]: WARNING nova.compute.manager [req-e19bb618-d622-4dd3-bd5b-cfd9269c4444 req-4153f707-0a43-444a-8fcd-782428194cbb service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state active and task_state rescuing. 
Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.virt.libvirt.host [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Removed pending event for d1fca309-1d26-4a34-b932-716064b86b00 due to event {{(pid=70954) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:59:06 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] VM Resumed (Lifecycle Event) Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-5cd8c589-962a-4387-8b2a-c20c34fc95b0 tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:59:06 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] During sync_power_state the instance has a pending task (rescuing). Skip. 
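Annotation: when the libvirt "Resumed" lifecycle event is emitted, the manager re-reads the domain power state and compares it with the database; because task_state is still 'rescuing', the sync is skipped rather than risk racing the in-flight rescue. A condensed sketch of that guard, under the simplifying assumption that only the pending-task check matters here (the real _sync_instance_power_state handles many more cases):

    def maybe_sync_power_state(db_instance, vm_power_state):
        # A pending task (e.g. 'rescuing') means another operation owns the
        # instance's state; syncing now could undo or fight that operation.
        if db_instance['task_state'] is not None:
            print('During sync_power_state the instance has a pending task '
                  '(%s). Skip.' % db_instance['task_state'])
            return
        if db_instance['power_state'] != vm_power_state:
            # ... reconcile the DB record / take corrective action ...
            pass

    maybe_sync_power_state({'task_state': 'rescuing', 'power_state': 1}, 1)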
Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 10:59:06 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] VM Started (Lifecycle Event) Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updating instance_info_cache with network_info: [{"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 10:59:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 10:59:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:07 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:08 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 10:59:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json" returned: 0 in 0.145s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 
in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 10:59:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 10:59:11 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 10:59:11 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
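Annotation: the update_available_resource audit above measures each instance disk by running qemu-img info in JSON mode, wrapped by oslo_concurrency's prlimit helper so a misbehaving qemu-img cannot use more than 1 GiB of address space (--as=1073741824) or 30 s of CPU (--cpu=30). A minimal sketch of issuing the same bounded call from Python (path copied from the log; this mirrors the logged invocation rather than nova's disk-inspection code):

    import json
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1024 * 1024 * 1024,  # --as=1073741824
                                        cpu_time=30)                       # --cpu=30

    # execute() prefixes the command with 'python -m oslo_concurrency.prlimit
    # --as=... --cpu=... --', which is exactly the wrapper seen in the log.
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue',
        '--force-share', '--output=json',
        prlimit=limits)

    info = json.loads(out)
    print(info['virtual-size'], info.get('actual-size'))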
Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8668MB free_disk=26.47055435180664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance d1fca309-1d26-4a34-b932-716064b86b00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 32a0063e-076e-4585-981e-fe853499aee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 332727a2-b516-40e2-9db1-460563e6ebd1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing inventories for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating ProviderTree inventory for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing aggregate associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, aggregates: None {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing trait associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 10:59:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.476s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 10:59:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task 
ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 10:59:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 10:59:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:42 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 10:59:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 10:59:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:00:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:00:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:00:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:00:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:06 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:06 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:00:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:00:06 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:00:06 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 
d1fca309-1d26-4a34-b932-716064b86b00] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:00:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:07 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updating instance_info_cache with network_info: [{"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:00:07 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-d1fca309-1d26-4a34-b932-716064b86b00" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:00:07 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] 
Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:00:08 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 
21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:00:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:00:11 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:00:11 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8737MB free_disk=26.45102310180664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance d1fca309-1d26-4a34-b932-716064b86b00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 32a0063e-076e-4585-981e-fe853499aee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 332727a2-b516-40e2-9db1-460563e6ebd1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:00:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.301s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "32a0063e-076e-4585-981e-fe853499aee3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:16 user nova-compute[70954]: INFO nova.compute.manager [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] 
[instance: 32a0063e-076e-4585-981e-fe853499aee3] Terminating instance Apr 21 11:00:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-4601ea9c-5f9e-4dc3-b786-7ebd490a4559 req-9a101b5e-76a6-440f-a43e-deed0cd4e2c1 service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-vif-unplugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4601ea9c-5f9e-4dc3-b786-7ebd490a4559 req-9a101b5e-76a6-440f-a43e-deed0cd4e2c1 service nova] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4601ea9c-5f9e-4dc3-b786-7ebd490a4559 req-9a101b5e-76a6-440f-a43e-deed0cd4e2c1 service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-4601ea9c-5f9e-4dc3-b786-7ebd490a4559 req-9a101b5e-76a6-440f-a43e-deed0cd4e2c1 service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-4601ea9c-5f9e-4dc3-b786-7ebd490a4559 req-9a101b5e-76a6-440f-a43e-deed0cd4e2c1 service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] No waiting events found dispatching network-vif-unplugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-4601ea9c-5f9e-4dc3-b786-7ebd490a4559 req-9a101b5e-76a6-440f-a43e-deed0cd4e2c1 service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-vif-unplugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 for instance with task_state deleting. 
{{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Instance destroyed successfully. Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.objects.instance [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lazy-loading 'resources' on Instance uuid 32a0063e-076e-4585-981e-fe853499aee3 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:58:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1462598035',display_name='tempest-ServersNegativeTestJSON-server-1462598035',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1462598035',id=22,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2023-04-21T10:58:32Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc12d15daf34c5e9d26a6cc53795efe',ramdisk_id='',reservation_id='r-zina7ghx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1265836442',owner_user_name='tempest-ServersNegativeTestJSON-1265836442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-21T10:58:33Z,user_data=None,user_id='95956d2e4ea84534b6d5628eb8dd184d',uuid=32a0063e-076e-4585-981e-fe853499aee3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converting VIF {"id": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "address": "fa:16:3e:67:64:57", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e412892-1f", "ovs_interfaceid": "3e412892-1f09-4bcc-8126-bf7d69b8b2d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG os_vif [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e412892-1f, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:00:17 user nova-compute[70954]: INFO os_vif [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:64:57,bridge_name='br-int',has_traffic_filtering=True,id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e412892-1f') Apr 21 11:00:17 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Deleting instance files /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3_del Apr 21 11:00:17 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Deletion of /opt/stack/data/nova/instances/32a0063e-076e-4585-981e-fe853499aee3_del complete Apr 21 11:00:17 user nova-compute[70954]: INFO nova.compute.manager [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 21 11:00:17 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:00:17 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Took 0.51 seconds to deallocate network for instance. Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-4472d241-f50e-4391-9dd1-87509089f3c6 req-8f82a658-cedf-48a1-bb31-94f37b7ec1ad service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-vif-deleted-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:17 user nova-compute[70954]: INFO nova.compute.manager [req-4472d241-f50e-4391-9dd1-87509089f3c6 req-8f82a658-cedf-48a1-bb31-94f37b7ec1ad service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Neutron deleted interface 3e412892-1f09-4bcc-8126-bf7d69b8b2d2; detaching it from the instance and deleting it from the info cache Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.network.neutron [req-4472d241-f50e-4391-9dd1-87509089f3c6 req-8f82a658-cedf-48a1-bb31-94f37b7ec1ad service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:00:17 user nova-compute[70954]: DEBUG nova.compute.manager [req-4472d241-f50e-4391-9dd1-87509089f3c6 req-8f82a658-cedf-48a1-bb31-94f37b7ec1ad service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Detach interface failed, port_id=3e412892-1f09-4bcc-8126-bf7d69b8b2d2, reason: Instance 32a0063e-076e-4585-981e-fe853499aee3 could not be found. 
{{(pid=70954) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.209s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:18 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Deleted allocations for instance 32a0063e-076e-4585-981e-fe853499aee3 Apr 21 11:00:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8a3a11d9-83b1-4471-be02-6c2c73e4ee8b tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "32a0063e-076e-4585-981e-fe853499aee3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.571s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:18 user nova-compute[70954]: 
DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:00:18 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:00:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-17289ae5-5beb-425a-8254-d2455cf0f57b req-10e42a90-2276-4ad7-af57-a0454b7bffde service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received event network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-17289ae5-5beb-425a-8254-d2455cf0f57b req-10e42a90-2276-4ad7-af57-a0454b7bffde service nova] Acquiring lock "32a0063e-076e-4585-981e-fe853499aee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-17289ae5-5beb-425a-8254-d2455cf0f57b req-10e42a90-2276-4ad7-af57-a0454b7bffde service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:19 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-17289ae5-5beb-425a-8254-d2455cf0f57b req-10e42a90-2276-4ad7-af57-a0454b7bffde service nova] Lock "32a0063e-076e-4585-981e-fe853499aee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:19 user nova-compute[70954]: DEBUG nova.compute.manager [req-17289ae5-5beb-425a-8254-d2455cf0f57b req-10e42a90-2276-4ad7-af57-a0454b7bffde service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] No waiting events found dispatching network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:00:19 user nova-compute[70954]: WARNING nova.compute.manager [req-17289ae5-5beb-425a-8254-d2455cf0f57b req-10e42a90-2276-4ad7-af57-a0454b7bffde service nova] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Received unexpected event network-vif-plugged-3e412892-1f09-4bcc-8126-bf7d69b8b2d2 for instance with vm_state deleted and task_state None. 
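For reference, the DelPortCommand transaction logged in the unplug sequence above corresponds to the following ovsdbapp usage. This is a minimal illustrative sketch, not nova's or os-vif's actual code; it assumes a local ovsdb-server listening on tcp:127.0.0.1:6640 (as shown in the log) and reuses the port and bridge names from the entries above.

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the local Open vSwitch database (endpoint taken from the log).
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    # if_exists=True makes the delete a no-op when the port is already gone,
    # matching DelPortCommand(..., if_exists=True) in the transaction above.
    api.del_port('tap3e412892-1f', bridge='br-int', if_exists=True).execute(
        check_error=True)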
Apr 21 11:00:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:00:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:00:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:00:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:00:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:30 user nova-compute[70954]: DEBUG nova.compute.manager [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-changed-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:30 user nova-compute[70954]: DEBUG nova.compute.manager [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Refreshing instance network info cache due to event network-changed-7d9b3006-2edc-475f-8387-86fc52c807f0. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 11:00:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] Acquiring lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:00:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] Acquired lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:00:30 user nova-compute[70954]: DEBUG nova.network.neutron [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Refreshing network info cache for port 7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 11:00:31 user nova-compute[70954]: DEBUG nova.network.neutron [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Updated VIF entry in instance network info cache for port 7d9b3006-2edc-475f-8387-86fc52c807f0. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 11:00:31 user nova-compute[70954]: DEBUG nova.network.neutron [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Updating instance_info_cache with network_info: [{"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.253", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:00:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-d85b3460-5850-4cd6-9ad2-81b3643eac57 req-eb6df3bf-7dd7-4465-aba5-bba26fb16d45 service nova] Releasing lock "refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "332727a2-b516-40e2-9db1-460563e6ebd1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:32 user nova-compute[70954]: INFO nova.compute.manager [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Terminating instance Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:00:32 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 32a0063e-076e-4585-981e-fe853499aee3] VM Stopped (Lifecycle Event) Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.compute.manager [None req-558b176a-a9d2-48b2-9707-3b8cdd4036ce None None] [instance: 32a0063e-076e-4585-981e-fe853499aee3] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.compute.manager [req-3a3ebdbb-b63f-4f45-abbb-d986f4fa97b1 req-8e402c1d-fc9e-42b6-884b-179cb76f95c6 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-vif-unplugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3a3ebdbb-b63f-4f45-abbb-d986f4fa97b1 req-8e402c1d-fc9e-42b6-884b-179cb76f95c6 service nova] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils 
[req-3a3ebdbb-b63f-4f45-abbb-d986f4fa97b1 req-8e402c1d-fc9e-42b6-884b-179cb76f95c6 service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3a3ebdbb-b63f-4f45-abbb-d986f4fa97b1 req-8e402c1d-fc9e-42b6-884b-179cb76f95c6 service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.compute.manager [req-3a3ebdbb-b63f-4f45-abbb-d986f4fa97b1 req-8e402c1d-fc9e-42b6-884b-179cb76f95c6 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] No waiting events found dispatching network-vif-unplugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.compute.manager [req-3a3ebdbb-b63f-4f45-abbb-d986f4fa97b1 req-8e402c1d-fc9e-42b6-884b-179cb76f95c6 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-vif-unplugged-7d9b3006-2edc-475f-8387-86fc52c807f0 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:00:32 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Instance destroyed successfully. Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.objects.instance [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lazy-loading 'resources' on Instance uuid 332727a2-b516-40e2-9db1-460563e6ebd1 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:58:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-117564941',display_name='tempest-AttachVolumeNegativeTest-server-117564941',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-117564941',id=23,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO8Apx3ELF1FQJTJmhq4XG7YL/DAo8z97ik4gGkKvut8Sus+PkfBnxQbqnCnfruRlDoOqDGvT630ViBsjv9qzGHLWN6zYB2m0AN9jyGLG1T5nwr0xbvXMEYFBE9kKO5cg==',key_name='tempest-keypair-2123760870',keypairs=,launch_index=0,launched_at=2023-04-21T10:58:45Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='14bc6b0c20204c8287b3523814007856',ramdisk_id='',reservation_id='r-fed6e10h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-159654333',owner_user_name='tempest-AttachVolumeNegativeTest-159654333-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:58:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d7fc66871488428e9842404d885bcfe3',uuid=332727a2-b516-40e2-9db1-460563e6ebd1,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.253", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converting VIF {"id": "7d9b3006-2edc-475f-8387-86fc52c807f0", "address": "fa:16:3e:77:1f:26", "network": {"id": "e0ccd2d9-69df-40e0-be8e-8328039f1bd0", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-587901453-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.253", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "14bc6b0c20204c8287b3523814007856", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d9b3006-2e", "ovs_interfaceid": "7d9b3006-2edc-475f-8387-86fc52c807f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG os_vif [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d9b3006-2e, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:00:32 user nova-compute[70954]: INFO os_vif [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:1f:26,bridge_name='br-int',has_traffic_filtering=True,id=7d9b3006-2edc-475f-8387-86fc52c807f0,network=Network(e0ccd2d9-69df-40e0-be8e-8328039f1bd0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d9b3006-2e') Apr 21 11:00:32 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 
tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Deleting instance files /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1_del Apr 21 11:00:32 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Deletion of /opt/stack/data/nova/instances/332727a2-b516-40e2-9db1-460563e6ebd1_del complete Apr 21 11:00:32 user nova-compute[70954]: INFO nova.compute.manager [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 21 11:00:32 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:00:32 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:00:33 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Took 0.90 seconds to deallocate network for instance. 
Apr 21 11:00:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-15ca6521-27e9-4bb0-b5c0-a58e15c10836 req-6a12c94c-1f35-40b4-a0d1-f62bd8e4a865 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-vif-deleted-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:00:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:33 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Deleted allocations for instance 332727a2-b516-40e2-9db1-460563e6ebd1 Apr 21 11:00:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-393ef6db-7039-4d03-bceb-45c239a8f810 tempest-AttachVolumeNegativeTest-159654333 tempest-AttachVolumeNegativeTest-159654333-project-member] Lock "332727a2-b516-40e2-9db1-460563e6ebd1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.880s {{(pid=70954) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:34 user nova-compute[70954]: DEBUG nova.compute.manager [req-dd6639f3-fdf6-4f89-95bb-c88a6a7ae02b req-4daec593-8ca5-4f70-a544-5f76cb953918 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received event network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:00:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-dd6639f3-fdf6-4f89-95bb-c88a6a7ae02b req-4daec593-8ca5-4f70-a544-5f76cb953918 service nova] Acquiring lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:00:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-dd6639f3-fdf6-4f89-95bb-c88a6a7ae02b req-4daec593-8ca5-4f70-a544-5f76cb953918 service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:00:34 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-dd6639f3-fdf6-4f89-95bb-c88a6a7ae02b req-4daec593-8ca5-4f70-a544-5f76cb953918 service nova] Lock "332727a2-b516-40e2-9db1-460563e6ebd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:00:34 user nova-compute[70954]: DEBUG nova.compute.manager [req-dd6639f3-fdf6-4f89-95bb-c88a6a7ae02b req-4daec593-8ca5-4f70-a544-5f76cb953918 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] No waiting events found dispatching network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:00:34 user nova-compute[70954]: WARNING nova.compute.manager [req-dd6639f3-fdf6-4f89-95bb-c88a6a7ae02b req-4daec593-8ca5-4f70-a544-5f76cb953918 service nova] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Received unexpected event network-vif-plugged-7d9b3006-2edc-475f-8387-86fc52c807f0 for instance with vm_state deleted and task_state None. 
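The recurring "Acquiring lock ... by ...", "acquired ... waited" and "released ... held" DEBUG lines throughout this section come from oslo.concurrency's lockutils wrappers. A minimal sketch of the two usual patterns, assuming nothing beyond oslo.concurrency itself (the lock names are copied from the log; the function bodies are placeholders):

    from oslo_concurrency import lockutils

    # Decorator form: serializes calls under a named semaphore, producing the
    # acquired/released lines with waited/held timings seen above.
    @lockutils.synchronized('compute_resources')
    def update_usage():
        pass  # placeholder for the critical section

    # Context-manager form, as used for the per-instance refresh_cache-* locks.
    with lockutils.lock('refresh_cache-332727a2-b516-40e2-9db1-460563e6ebd1'):
        pass  # placeholder

    update_usage()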
Apr 21 11:00:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:00:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:00:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:00:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:00:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:00:47 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:00:47 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] VM Stopped (Lifecycle Event) Apr 21 11:00:47 user nova-compute[70954]: DEBUG nova.compute.manager [None req-243d3401-a526-4ab2-a6fe-ecf7f07fc734 None None] [instance: 332727a2-b516-40e2-9db1-460563e6ebd1] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:00:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:00:57 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:02 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:07 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:01:08 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updating instance_info_cache with network_info: [{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None 
req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:01:09 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk.rescue --force-share --output=json" returned: 0 in 0.125s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:10 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:01:11 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:01:11 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
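The qemu-img invocations above are issued through oslo.concurrency's processutils with resource limits applied via prlimit (--as=1073741824 --cpu=30 in the logged command line). A hedged sketch of that pattern follows; the limits mirror the log, but the disk path is a placeholder and this is not a copy of nova's own helper:

    import json
    from oslo_concurrency import processutils

    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,                       # matches --cpu=30
        address_space=1024 * 1024 * 1024)  # matches --as=1073741824 (1 GiB)

    def qemu_img_info(path):
        # --force-share lets qemu-img read an image a running guest still
        # holds open; --output=json keeps the result parseable.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return json.loads(out)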
Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8897MB free_disk=26.489501953125GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance d1fca309-1d26-4a34-b932-716064b86b00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:01:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task 
[None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:17 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:18 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:19 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:01:19 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:01:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:22 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:27 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:32 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:52 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout 
{{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "d1fca309-1d26-4a34-b932-716064b86b00" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:55 user nova-compute[70954]: INFO nova.compute.manager [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Terminating instance Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.compute.manager [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Start destroying the instance on the hypervisor. 
{{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-b831a792-1a2f-40ec-abcc-30aea9a56e5a req-52e80f05-e9c5-4634-86d8-6c252833fb14 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-unplugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b831a792-1a2f-40ec-abcc-30aea9a56e5a req-52e80f05-e9c5-4634-86d8-6c252833fb14 service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b831a792-1a2f-40ec-abcc-30aea9a56e5a req-52e80f05-e9c5-4634-86d8-6c252833fb14 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b831a792-1a2f-40ec-abcc-30aea9a56e5a req-52e80f05-e9c5-4634-86d8-6c252833fb14 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-b831a792-1a2f-40ec-abcc-30aea9a56e5a req-52e80f05-e9c5-4634-86d8-6c252833fb14 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-unplugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-b831a792-1a2f-40ec-abcc-30aea9a56e5a req-52e80f05-e9c5-4634-86d8-6c252833fb14 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-unplugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:01:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Instance destroyed successfully. 
Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.objects.instance [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'resources' on Instance uuid d1fca309-1d26-4a34-b932-716064b86b00 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-6822012',display_name='tempest-ServerRescueNegativeTestJSON-server-6822012',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-6822012',id=21,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:59:06Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-cfw0zm0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:59:06Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=d1fca309-1d26-4a34-b932-716064b86b00,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "8abc9260-fa02-4915-a056-63262b57e3be", "address": "fa:16:3e:8d:97:64", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8abc9260-fa", "ovs_interfaceid": "8abc9260-fa02-4915-a056-63262b57e3be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG os_vif [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8abc9260-fa, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:01:55 user nova-compute[70954]: INFO os_vif [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:97:64,bridge_name='br-int',has_traffic_filtering=True,id=8abc9260-fa02-4915-a056-63262b57e3be,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8abc9260-fa') Apr 21 11:01:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Deleting instance files /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00_del Apr 21 11:01:55 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Deletion of /opt/stack/data/nova/instances/d1fca309-1d26-4a34-b932-716064b86b00_del complete Apr 21 11:01:55 user nova-compute[70954]: INFO nova.compute.manager [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 21 11:01:55 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:01:55 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:01:56 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:01:56 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Took 0.56 seconds to deallocate network for instance. 
Apr 21 11:01:56 user nova-compute[70954]: DEBUG nova.compute.manager [req-69d30f2e-4407-49c6-9884-0f8c434eccdd req-b72b0d30-d477-4e92-b45d-0c910f220401 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-deleted-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:01:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:56 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:01:56 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:01:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:56 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Deleted allocations for instance d1fca309-1d26-4a34-b932-716064b86b00 Apr 21 11:01:56 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-e01d5701-a573-4c29-ac36-49a717a2c77f tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "d1fca309-1d26-4a34-b932-716064b86b00" "released" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.643s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-be21e5bd-efb4-4b96-ac77-17ac6e317609 req-b7df87e7-e500-4080-a418-4b228711bae1 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:01:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-be21e5bd-efb4-4b96-ac77-17ac6e317609 req-b7df87e7-e500-4080-a418-4b228711bae1 service nova] Acquiring lock "d1fca309-1d26-4a34-b932-716064b86b00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:01:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-be21e5bd-efb4-4b96-ac77-17ac6e317609 req-b7df87e7-e500-4080-a418-4b228711bae1 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:01:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-be21e5bd-efb4-4b96-ac77-17ac6e317609 req-b7df87e7-e500-4080-a418-4b228711bae1 service nova] Lock "d1fca309-1d26-4a34-b932-716064b86b00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:01:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-be21e5bd-efb4-4b96-ac77-17ac6e317609 req-b7df87e7-e500-4080-a418-4b228711bae1 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] No waiting events found dispatching network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:01:57 user nova-compute[70954]: WARNING nova.compute.manager [req-be21e5bd-efb4-4b96-ac77-17ac6e317609 req-b7df87e7-e500-4080-a418-4b228711bae1 service nova] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Received unexpected event network-vif-plugged-8abc9260-fa02-4915-a056-63262b57e3be for instance with vm_state deleted and task_state None. 
Apr 21 11:02:00 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:08 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:08 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:02:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:02:08 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:02:08 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:02:09 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updating instance_info_cache with network_info: [{"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:02:09 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:02:09 user nova-compute[70954]: DEBUG nova.compute.manager [None 
req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:02:10 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:02:10 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: d1fca309-1d26-4a34-b932-716064b86b00] VM Stopped (Lifecycle Event) Apr 21 11:02:10 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:10 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:02:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b306fe56-5066-4010-b38b-dcfd8275a9ec None None] [instance: d1fca309-1d26-4a34-b932-716064b86b00] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:02:11 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:02:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:02:13 user 
nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:02:13 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9011MB free_disk=26.517498016357422GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:02:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:15 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task 
[None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:20 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:21 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:02:21 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:02:25 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:30 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:02:35 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:40 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:45 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 
tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:46 user nova-compute[70954]: INFO nova.compute.manager [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Terminating instance Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.compute.manager [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-3d2399fb-7db1-46b0-ac54-a190bb72a6c4 req-dd910c27-40d1-44cc-ab4b-0a6eccbc6ba2 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-vif-unplugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3d2399fb-7db1-46b0-ac54-a190bb72a6c4 req-dd910c27-40d1-44cc-ab4b-0a6eccbc6ba2 service nova] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3d2399fb-7db1-46b0-ac54-a190bb72a6c4 req-dd910c27-40d1-44cc-ab4b-0a6eccbc6ba2 service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-3d2399fb-7db1-46b0-ac54-a190bb72a6c4 req-dd910c27-40d1-44cc-ab4b-0a6eccbc6ba2 service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-3d2399fb-7db1-46b0-ac54-a190bb72a6c4 req-dd910c27-40d1-44cc-ab4b-0a6eccbc6ba2 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] No waiting events found dispatching network-vif-unplugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-3d2399fb-7db1-46b0-ac54-a190bb72a6c4 req-dd910c27-40d1-44cc-ab4b-0a6eccbc6ba2 service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-vif-unplugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:02:46 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Instance destroyed successfully. Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.objects.instance [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lazy-loading 'resources' on Instance uuid 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1050316627',display_name='tempest-ServerRescueNegativeTestJSON-server-1050316627',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1050316627',id=20,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:57:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c0a611b8a8d54522929c37807054b2f6',ramdisk_id='',reservation_id='r-i4m218s0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-d
isk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-1656706265',owner_user_name='tempest-ServerRescueNegativeTestJSON-1656706265-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:57:18Z,user_data=None,user_id='0f73ac02062c4411bde0c97f6a719926',uuid=476dbf2e-b02a-47bc-a8c6-6d0d66d5d433,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converting VIF {"id": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "address": "fa:16:3e:d7:18:9c", "network": {"id": "72777f52-fe61-4f05-b2c6-5edb74fb3138", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1022240465-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c0a611b8a8d54522929c37807054b2f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb82372c-8c", "ovs_interfaceid": "fb82372c-8c1a-43e8-9eba-5f5469b8ac66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG os_vif [None 
req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb82372c-8c, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:02:46 user nova-compute[70954]: INFO os_vif [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:18:9c,bridge_name='br-int',has_traffic_filtering=True,id=fb82372c-8c1a-43e8-9eba-5f5469b8ac66,network=Network(72777f52-fe61-4f05-b2c6-5edb74fb3138),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb82372c-8c') Apr 21 11:02:46 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Deleting instance files /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433_del Apr 21 11:02:46 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Deletion of /opt/stack/data/nova/instances/476dbf2e-b02a-47bc-a8c6-6d0d66d5d433_del complete Apr 21 11:02:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:46 user nova-compute[70954]: INFO nova.compute.manager [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Took 0.66 seconds to destroy the instance on the hypervisor. 
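In code terms, the unplug sequence recorded above is: nova converts its VIF dict into an os-vif VIFOpenVSwitch object and hands it to the module-level os_vif.unplug(), whose ovs plugin then issues the DelPortCommand transaction that ovsdbapp logs. A minimal standalone sketch, assuming the object fields shown in the log entries above (values copied from the log; this is illustrative, not nova's own helper code):

import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

net = network.Network(id='72777f52-fe61-4f05-b2c6-5edb74fb3138', bridge='br-int')
ovs_vif = vif.VIFOpenVSwitch(
    id='fb82372c-8c1a-43e8-9eba-5f5469b8ac66',
    address='fa:16:3e:d7:18:9c',
    bridge_name='br-int',
    vif_name='tapfb82372c-8c',
    plugin='ovs',
    network=net,
)
inst = instance_info.InstanceInfo(
    uuid='476dbf2e-b02a-47bc-a8c6-6d0d66d5d433',
    name='tempest-ServerRescueNegativeTestJSON-server-1050316627')

# The ovs plugin removes the tap port from br-int, which is what the
# "Running txn ... DelPortCommand(... port=tapfb82372c-8c ...)" entry records.
os_vif.unplug(ovs_vif, inst)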
Apr 21 11:02:46 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:02:46 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:02:47 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:02:47 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Took 0.54 seconds to deallocate network for instance. Apr 21 11:02:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:47 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:02:47 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:02:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe 
tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:47 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Deleted allocations for instance 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433 Apr 21 11:02:47 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-aadd679c-5533-4a72-a66f-c77e34441ebe tempest-ServerRescueNegativeTestJSON-1656706265 tempest-ServerRescueNegativeTestJSON-1656706265-project-member] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.560s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:02:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] Acquiring lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:02:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:02:48 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] Lock "476dbf2e-b02a-47bc-a8c6-6d0d66d5d433-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:02:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] No waiting events found dispatching network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:02:48 user nova-compute[70954]: WARNING nova.compute.manager [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received unexpected event network-vif-plugged-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 for instance with vm_state deleted and task_state None. 
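The paired "Acquiring lock" / "acquired ... waited" / '"released" ... held' entries around terminate_instance and update_usage come from oslo.concurrency's lock decorator; the timings in the trailer are emitted by its inner wrapper. A minimal sketch of the same pattern, assuming plain lockutils usage (the function name here is illustrative, not nova's):

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_usage():
    # Body runs while the named semaphore is held; the decorator logs
    # "Acquiring" / "acquired ... waited" / "released ... held" at DEBUG.
    pass

update_usage()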
Apr 21 11:02:48 user nova-compute[70954]: DEBUG nova.compute.manager [req-8b14407e-2b6a-4446-ad91-7e12f60136c9 req-d65e49f9-430b-4c0d-ae82-cf615311ca7e service nova] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Received event network-vif-deleted-fb82372c-8c1a-43e8-9eba-5f5469b8ac66 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:02:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:02:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:03:01 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:03:01 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] VM Stopped (Lifecycle Event) Apr 21 11:03:01 user nova-compute[70954]: DEBUG nova.compute.manager [None req-f3d900bc-9301-423a-9521-9b1856902fd0 None None] [instance: 476dbf2e-b02a-47bc-a8c6-6d0d66d5d433] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:03:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:03:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:03:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. 
{{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 11:03:09 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:09 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances with incomplete migration {{(pid=70954) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:11 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:03:12 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:03:12 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
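The prlimit-wrapped qemu-img invocations above are produced by oslo.concurrency's processutils with a ProcessLimits guard (1 GiB address space and 30 s of CPU, matching --as=1073741824 --cpu=30 in the logged command line). A rough equivalent, assuming the disk path taken from the log (illustrative, not nova's images helper):

from oslo_concurrency import processutils

disk = '/opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk'
out, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info', disk, '--force-share', '--output=json',
    prlimit=processutils.ProcessLimits(address_space=1073741824,  # --as
                                       cpu_time=30))              # --cpu
print(out)  # JSON metadata for the qcow2 disk, as parsed by the resource audit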
Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9089MB free_disk=26.536449432373047GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:03:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:03:18 user nova-compute[70954]: DEBUG oslo_service.periodic_task 
[None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:18 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:03:22 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:22 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:03:23 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:24 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:03:24 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 11:03:24 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] There are 0 instances to clean {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 11:03:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:03:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:37 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:39 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:44 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:47 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Starting instance... {{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 11:03:50 user nova-compute[70954]: INFO nova.compute.claims [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Claim successful on node user Apr 21 11:03:50 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:03:50 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Start building networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 11:03:51 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Start spawning the instance on the hypervisor. {{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 11:03:51 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Creating image(s) Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "/opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "/opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "/opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.006s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share 
--output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.policy [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3ef4e7c36ed43d9a00f7b7b9731917e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38ec5db9e7c744dcb2d4ae6737822da4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.136s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 
11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk 1073741824" returned: 0 in 0.047s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.188s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.135s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk. 
size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Cannot resize image /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk to a smaller size. {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.objects.instance [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lazy-loading 'migration_context' on Instance uuid 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Ensure instance console log exists: /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock 
"vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:51 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:52 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Successfully created port: f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Successfully updated port: f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquired lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.compute.manager [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-changed-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.compute.manager [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Refreshing instance network info cache due to event network-changed-f838d84d-a8eb-4c45-945a-ea4621cd1928. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] Acquiring lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Instance cache missing network info. {{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.neutron [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updating instance_info_cache with network_info: [{"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Releasing lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Instance network_info: |[{"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] Acquired lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.neutron [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Refreshing network info cache for port f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Start _get_guest_xml network_info=[{"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 
'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 11:03:53 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:03:53 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab 
tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T11:03:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2066233937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-2066233937',id=24,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMZhnojnQmcZDKi08FqYsbblwCAdGeKVIOD67QqUH/+9e68mmZSA8wvYLSVOq1Y8R7Sh5PCbGPREdIuVm7tQRs5SvoKAjUvEcLyGAJtCQ4wo+oC1c20rBKxLkZu2wABQhQ==',key_name='tempest-keypair-74686879',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ec5db9e7c744dcb2d4ae6737822da4',ramdisk_id='',reservation_id='r-1srjkynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1477853719',owner_user_name='tempest-AttachVolumeShelveTestJSON-1477853719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T11:03:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b3ef4e7c36ed43d9a00f7b7b9731917e',uuid=4dc4e5a5-6f31-4466-a6bb-dae9a2a93585,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converting VIF {"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.objects.instance [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lazy-loading 'pci_devices' on Instance uuid 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] End _get_guest_xml xml= Apr 21 11:03:53 user nova-compute[70954]: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 Apr 21 11:03:53 user nova-compute[70954]: instance-00000018 Apr 21 11:03:53 user nova-compute[70954]: 131072 Apr 21 11:03:53 user nova-compute[70954]: 1 Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: tempest-AttachVolumeShelveTestJSON-server-2066233937 Apr 21 11:03:53 user nova-compute[70954]: 2023-04-21 11:03:53 Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: 128 Apr 21 11:03:53 user nova-compute[70954]: 1 Apr 21 11:03:53 user nova-compute[70954]: 0 Apr 21 11:03:53 user nova-compute[70954]: 0 Apr 21 11:03:53 user nova-compute[70954]: 1 Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: tempest-AttachVolumeShelveTestJSON-1477853719-project-member Apr 21 11:03:53 user nova-compute[70954]: tempest-AttachVolumeShelveTestJSON-1477853719 Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: OpenStack Foundation Apr 21 11:03:53 user nova-compute[70954]: OpenStack Nova Apr 21 11:03:53 user 
nova-compute[70954]: 0.0.0 Apr 21 11:03:53 user nova-compute[70954]: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 Apr 21 11:03:53 user nova-compute[70954]: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 Apr 21 11:03:53 user nova-compute[70954]: Virtual Machine Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: hvm Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Nehalem Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: /dev/urandom Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: Apr 21 11:03:53 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T11:03:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2066233937',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-2066233937',id=24,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMZhnojnQmcZDKi08FqYsbblwCAdGeKVIOD67QqUH/+9e68mmZSA8wvYLSVOq1Y8R7Sh5PCbGPREdIuVm7tQRs5SvoKAjUvEcLyGAJtCQ4wo+oC1c20rBKxLkZu2wABQhQ==',key_name='tempest-keypair-74686879',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ec5db9e7c744dcb2d4ae6737822da4',ramdisk_id='',reservation_id='r-1srjkynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1477853719',owner_user_name='tempest-AttachVolumeShelveTestJSON-1477853719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T11:03:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b3ef4e7c36ed43d9a00f7b7b9731917e',uuid=4dc4e5a5-6f31-4466-a6bb-dae9a2a93585,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converting VIF {"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG os_vif [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf838d84d-a8, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf838d84d-a8, col_values=(('external_ids', {'iface-id': 'f838d84d-a8eb-4c45-945a-ea4621cd1928', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:be:87', 'vm-uuid': '4dc4e5a5-6f31-4466-a6bb-dae9a2a93585'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:03:53 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:53 user nova-compute[70954]: INFO os_vif [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 11:03:53 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] No VIF found with MAC fa:16:3e:55:be:87, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 11:03:54 user nova-compute[70954]: DEBUG nova.network.neutron [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updated VIF entry in instance network info cache for port f838d84d-a8eb-4c45-945a-ea4621cd1928. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 11:03:54 user nova-compute[70954]: DEBUG nova.network.neutron [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updating instance_info_cache with network_info: [{"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:03:54 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-31ec8c18-34f1-4d75-8fc1-c89aab882594 req-bbbcf7f1-d39d-43c1-83f0-b262f64ac4ed service nova] Releasing lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-81f4acca-749e-44fd-8af5-4bfd2725a45c req-d1355642-2710-4b2a-95d5-87d80d516f89 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-81f4acca-749e-44fd-8af5-4bfd2725a45c req-d1355642-2710-4b2a-95d5-87d80d516f89 service nova] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-81f4acca-749e-44fd-8af5-4bfd2725a45c req-d1355642-2710-4b2a-95d5-87d80d516f89 service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:55 
user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-81f4acca-749e-44fd-8af5-4bfd2725a45c req-d1355642-2710-4b2a-95d5-87d80d516f89 service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG nova.compute.manager [req-81f4acca-749e-44fd-8af5-4bfd2725a45c req-d1355642-2710-4b2a-95d5-87d80d516f89 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] No waiting events found dispatching network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:03:55 user nova-compute[70954]: WARNING nova.compute.manager [req-81f4acca-749e-44fd-8af5-4bfd2725a45c req-d1355642-2710-4b2a-95d5-87d80d516f89 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received unexpected event network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 for instance with vm_state building and task_state spawning. Apr 21 11:03:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:55 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:03:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] VM Resumed (Lifecycle Event) Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 11:03:57 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Instance spawned successfully. 
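The image-creation entries earlier in this trace show the root disk being built as a qcow2 overlay on a cached base image (qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw ... 1073741824), with qemu-img info --force-share --output=json used to probe the images before and after. A minimal Python sketch of that same command pattern follows; the helper names and the bare subprocess calls are illustrative only, since Nova itself drives these commands through oslo.concurrency processutils under a prlimit wrapper, exactly as logged above.

    # Illustrative sketch of the qemu-img pattern recorded above. Paths and the
    # 1 GiB size are taken from the log lines; adjust for your own environment.
    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a"
    OVERLAY = "/opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk"

    def image_info(path):
        # Same flags as the logged command: --force-share allows probing an image
        # that may be in use, --output=json returns machine-readable metadata.
        out = subprocess.run(
            ["qemu-img", "info", path, "--force-share", "--output=json"],
            check=True, capture_output=True, text=True).stdout
        return json.loads(out)

    def create_overlay(base, overlay, size_bytes):
        # qcow2 overlay on top of a raw base image, mirroring the logged
        # "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw" call.
        subprocess.run(
            ["qemu-img", "create", "-f", "qcow2",
             "-o", f"backing_file={base},backing_fmt=raw",
             overlay, str(size_bytes)],
            check=True)

    if __name__ == "__main__":
        create_overlay(BASE, OVERLAY, 1073741824)  # 1 GiB, as in the log
        print(image_info(OVERLAY).get("backing-filename"))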
Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 
tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:03:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:03:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] VM Started (Lifecycle Event) Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 11:03:57 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 11:03:57 user nova-compute[70954]: INFO nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Took 6.10 seconds to spawn the instance on the hypervisor. 
Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-627ff99d-3fc4-4903-ba86-9e6d45311450 req-bf86a5c5-0920-4d52-a6f4-40a598625e22 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-627ff99d-3fc4-4903-ba86-9e6d45311450 req-bf86a5c5-0920-4d52-a6f4-40a598625e22 service nova] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-627ff99d-3fc4-4903-ba86-9e6d45311450 req-bf86a5c5-0920-4d52-a6f4-40a598625e22 service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-627ff99d-3fc4-4903-ba86-9e6d45311450 req-bf86a5c5-0920-4d52-a6f4-40a598625e22 service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:57 user nova-compute[70954]: DEBUG nova.compute.manager [req-627ff99d-3fc4-4903-ba86-9e6d45311450 req-bf86a5c5-0920-4d52-a6f4-40a598625e22 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] No waiting events found dispatching network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:03:57 user nova-compute[70954]: WARNING nova.compute.manager [req-627ff99d-3fc4-4903-ba86-9e6d45311450 req-bf86a5c5-0920-4d52-a6f4-40a598625e22 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received unexpected event network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 for instance with vm_state building and task_state spawning. Apr 21 11:03:57 user nova-compute[70954]: INFO nova.compute.manager [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Took 6.69 seconds to build instance. 
Apr 21 11:03:57 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-ad58db54-7980-4844-bf65-857277d3a9ab tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.786s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:03:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:03:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_power_states {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Triggering sync for uuid 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 {{(pid=70954) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184" "released" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.026s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:04:02 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.028s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:04:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:04:10 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:04:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updating instance_info_cache with network_info: [{"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", 
"label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:04:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-595d41a4-9a01-4aa2-96a1-c2c763475184" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:04:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:04:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:04:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:04:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:04:12 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:04:12 user nova-compute[70954]: DEBUG 
oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None 
req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:13 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:04:13 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:04:13 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=8951MB free_disk=26.514373779296875GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", 
"product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:04:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 595d41a4-9a01-4aa2-96a1-c2c763475184 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing inventories for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating ProviderTree inventory for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing aggregate associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, aggregates: None {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing trait associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:04:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.389s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:04:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:15 user nova-compute[70954]: 
DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:23 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:04:23 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:04:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:04:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:04:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:04:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:04:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:04:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:04:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:04:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:05:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:05:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:05:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:05:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:05:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:05:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:05:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:05:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:11 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:11 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:05:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:05:11 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:05:11 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updating instance_info_cache with network_info: [{"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:12 user nova-compute[70954]: INFO nova.compute.manager [None 
req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Terminating instance Apr 21 11:05:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-c540df67-89db-4cb8-8ae0-067ad3f9a26c req-6f3159b5-bf8a-4e1f-8c0e-9b80c0525b03 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event network-vif-unplugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-c540df67-89db-4cb8-8ae0-067ad3f9a26c req-6f3159b5-bf8a-4e1f-8c0e-9b80c0525b03 service nova] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-c540df67-89db-4cb8-8ae0-067ad3f9a26c req-6f3159b5-bf8a-4e1f-8c0e-9b80c0525b03 service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-c540df67-89db-4cb8-8ae0-067ad3f9a26c req-6f3159b5-bf8a-4e1f-8c0e-9b80c0525b03 service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-c540df67-89db-4cb8-8ae0-067ad3f9a26c req-6f3159b5-bf8a-4e1f-8c0e-9b80c0525b03 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] No waiting events found dispatching network-vif-unplugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.compute.manager 
[req-c540df67-89db-4cb8-8ae0-067ad3f9a26c req-6f3159b5-bf8a-4e1f-8c0e-9b80c0525b03 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event network-vif-unplugged-eb0b0125-965b-4825-aab1-3ba81be44c2f for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:13 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Instance destroyed successfully. Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.objects.instance [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lazy-loading 'resources' on Instance uuid 595d41a4-9a01-4aa2-96a1-c2c763475184 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T10:56:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-2123160889',display_name='tempest-ServersNegativeTestJSON-server-2123160889',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-2123160889',id=19,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T10:56:41Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='acc12d15daf34c5e9d26a6cc53795efe',ramdisk_id='',reservation_id='r-1dp2q30u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1265836442',owner_user_name='tempest-ServersNegativeTestJSON-1265836442-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T10:56:41Z,user_data=None,user_id='95956d2e4ea84534b6d5628eb8dd184d',uuid=595d41a4-9a01-4aa2-96a1-c2c763475184,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", 
"address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converting VIF {"id": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "address": "fa:16:3e:13:39:c1", "network": {"id": "3e633eed-7c28-4111-849c-3ab0f46c0c5c", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1483635329-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "acc12d15daf34c5e9d26a6cc53795efe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb0b0125-96", "ovs_interfaceid": "eb0b0125-965b-4825-aab1-3ba81be44c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG os_vif [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') {{(pid=70954) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0b0125-96, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:05:13 user nova-compute[70954]: INFO os_vif [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:39:c1,bridge_name='br-int',has_traffic_filtering=True,id=eb0b0125-965b-4825-aab1-3ba81be44c2f,network=Network(3e633eed-7c28-4111-849c-3ab0f46c0c5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb0b0125-96') Apr 21 11:05:13 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Deleting instance files /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184_del Apr 21 11:05:13 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Deletion of /opt/stack/data/nova/instances/595d41a4-9a01-4aa2-96a1-c2c763475184_del complete Apr 21 11:05:13 user nova-compute[70954]: INFO nova.compute.manager [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 11:05:13 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:05:13 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Took 0.47 seconds to deallocate network for instance. Apr 21 11:05:13 user nova-compute[70954]: DEBUG nova.compute.manager [req-c03cbc15-b40c-4d6c-8a21-963b7dba1e97 req-f93f9b8a-656a-473c-96ed-e1c356e93cda service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event network-vif-deleted-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 
tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.203s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:14 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Deleted allocations for instance 595d41a4-9a01-4aa2-96a1-c2c763475184 Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-b672a9a9-3768-4376-bd57-dff15dc636e3 tempest-ServersNegativeTestJSON-1265836442 tempest-ServersNegativeTestJSON-1265836442-project-member] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.755s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:05:14 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:05:15 user 
nova-compute[70954]: DEBUG nova.compute.manager [req-45bfcab5-2e77-4508-92ef-0ef9039454a1 req-62ce40ea-a060-402a-97f2-c459ec3db614 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received event network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-45bfcab5-2e77-4508-92ef-0ef9039454a1 req-62ce40ea-a060-402a-97f2-c459ec3db614 service nova] Acquiring lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-45bfcab5-2e77-4508-92ef-0ef9039454a1 req-62ce40ea-a060-402a-97f2-c459ec3db614 service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-45bfcab5-2e77-4508-92ef-0ef9039454a1 req-62ce40ea-a060-402a-97f2-c459ec3db614 service nova] Lock "595d41a4-9a01-4aa2-96a1-c2c763475184-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.manager [req-45bfcab5-2e77-4508-92ef-0ef9039454a1 req-62ce40ea-a060-402a-97f2-c459ec3db614 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] No waiting events found dispatching network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:05:15 user nova-compute[70954]: WARNING nova.compute.manager [req-45bfcab5-2e77-4508-92ef-0ef9039454a1 req-62ce40ea-a060-402a-97f2-c459ec3db614 service nova] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Received unexpected event network-vif-plugged-eb0b0125-965b-4825-aab1-3ba81be44c2f for instance with vm_state deleted and task_state None. 
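[editor's illustrative sketch - not part of the log] The three lockutils lines above show the per-instance "<uuid>-events" lock being taken by InstanceEvents.pop_instance_event before the network-vif-plugged event is matched against waiters; because the server is already deleted there is no waiter, so the event is logged as unexpected. A minimal sketch of that pattern, assuming a hypothetical waiter registry (this is not nova's actual implementation):

    from oslo_concurrency import lockutils

    # hypothetical registry of callers waiting on events, keyed by instance
    # uuid and event name; nova keeps a richer structure internally
    _waiters = {}

    def pop_instance_event(instance_uuid, event_name):
        # same locking granularity as the "<uuid>-events" lock in the log
        with lockutils.lock('%s-events' % instance_uuid):
            return _waiters.get(instance_uuid, {}).pop(event_name, None)

    def external_instance_event(instance, event_name):
        waiter = pop_instance_event(instance['uuid'], event_name)
        if waiter is None:
            # corresponds to "No waiting events found dispatching ..." and,
            # for an already-deleted server, the WARNING about an unexpected
            # event with vm_state deleted and task_state None
            print('unexpected %s for vm_state=%s'
                  % (event_name, instance['vm_state']))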
Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:05:15 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:05:15 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
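[editor's illustrative sketch - not part of the log] The update_available_resource periodic task above probes each instance disk with qemu-img run under a prlimit sandbox (1 GiB address space, 30 s CPU, i.e. the "--as=1073741824 --cpu=30" arguments). A sketch of how such a call can be issued with oslo.concurrency; the exact wrapper nova uses may differ:

    from oslo_concurrency import processutils

    # cap the child process at 1 GiB of address space and 30 s of CPU time;
    # processutils prefixes the command with "python -m oslo_concurrency.prlimit"
    # exactly as seen in the log line
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585/disk',
        '--force-share', '--output=json',
        prlimit=limits)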
Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9082MB free_disk=26.53313446044922GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:05:15 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.278s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:18 user nova-compute[70954]: DEBUG oslo_service.periodic_task 
[None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:25 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:05:25 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:05:28 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] VM Stopped (Lifecycle Event) Apr 21 11:05:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-08a088f7-974b-4042-b471-b309b77dc3b9 None None] [instance: 595d41a4-9a01-4aa2-96a1-c2c763475184] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:05:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:05:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 
req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-changed-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:41 user nova-compute[70954]: DEBUG nova.compute.manager [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Refreshing instance network info cache due to event network-changed-f838d84d-a8eb-4c45-945a-ea4621cd1928. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 11:05:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] Acquiring lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:05:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] Acquired lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:05:41 user nova-compute[70954]: DEBUG nova.network.neutron [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Refreshing network info cache for port f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 11:05:42 user nova-compute[70954]: DEBUG nova.network.neutron [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updated VIF entry in instance network info cache for port f838d84d-a8eb-4c45-945a-ea4621cd1928. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 11:05:42 user nova-compute[70954]: DEBUG nova.network.neutron [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updating instance_info_cache with network_info: [{"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:05:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-b945d0f9-85af-4984-b4d0-3ff3cc208497 req-488309d3-6a8f-40ec-91ee-730be96550fc service nova] Releasing lock "refresh_cache-4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:43 user nova-compute[70954]: INFO nova.compute.manager [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Terminating instance Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.compute.manager [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-a0858c48-0d52-46e1-a135-a530e450e671 req-7da3f055-2315-4816-8849-743205565452 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-vif-unplugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a0858c48-0d52-46e1-a135-a530e450e671 req-7da3f055-2315-4816-8849-743205565452 service nova] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-a0858c48-0d52-46e1-a135-a530e450e671 req-7da3f055-2315-4816-8849-743205565452 service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:43 user nova-compute[70954]: 
DEBUG oslo_concurrency.lockutils [req-a0858c48-0d52-46e1-a135-a530e450e671 req-7da3f055-2315-4816-8849-743205565452 service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-a0858c48-0d52-46e1-a135-a530e450e671 req-7da3f055-2315-4816-8849-743205565452 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] No waiting events found dispatching network-vif-unplugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.compute.manager [req-a0858c48-0d52-46e1-a135-a530e450e671 req-7da3f055-2315-4816-8849-743205565452 service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-vif-unplugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:05:43 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Instance destroyed successfully. Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.objects.instance [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lazy-loading 'resources' on Instance uuid 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T11:03:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-2066233937',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-2066233937',id=24,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMZhnojnQmcZDKi08FqYsbblwCAdGeKVIOD67QqUH/+9e68mmZSA8wvYLSVOq1Y8R7Sh5PCbGPREdIuVm7tQRs5SvoKAjUvEcLyGAJtCQ4wo+oC1c20rBKxLkZu2wABQhQ==',key_name='tempest-keypair-74686879',keypairs=,launch_index=0,launched_at=2023-04-21T11:03:57Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='38ec5db9e7c744dcb2d4ae6737822da4',ramdisk_id='',reservation_id='r-1srjkynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1477853719',owner_user_name='tempest-AttachVolumeShelveTestJSON-1477853719-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T11:03:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b3ef4e7c36ed43d9a00f7b7b9731917e',uuid=4dc4e5a5-6f31-4466-a6bb-dae9a2a93585,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converting VIF {"id": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "address": "fa:16:3e:55:be:87", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.77", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838d84d-a8", "ovs_interfaceid": "f838d84d-a8eb-4c45-945a-ea4621cd1928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG os_vif [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf838d84d-a8, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:05:43 user nova-compute[70954]: INFO os_vif [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:be:87,bridge_name='br-int',has_traffic_filtering=True,id=f838d84d-a8eb-4c45-945a-ea4621cd1928,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838d84d-a8') Apr 21 11:05:43 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-73e07410-435a-46b0-841a-f80a1ebe6717 
tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Deleting instance files /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585_del Apr 21 11:05:43 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Deletion of /opt/stack/data/nova/instances/4dc4e5a5-6f31-4466-a6bb-dae9a2a93585_del complete Apr 21 11:05:44 user nova-compute[70954]: INFO nova.compute.manager [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 21 11:05:44 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:05:44 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:05:44 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:05:45 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:05:45 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Took 1.21 seconds to deallocate network for instance. 
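[editor's illustrative sketch - not part of the log] Unplugging the OVS VIF above reduces to a single OVSDB transaction that removes the tap port from br-int (the "Running txn n=1 command(idx=0): DelPortCommand(...)" line). A rough sketch using ovsdbapp directly, assuming the local ovsdb-server listens on tcp:127.0.0.1:6640 as the reconnect messages suggest; os-vif wraps this differently internally:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # connect to the local ovsdb-server the log talks to
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # one transaction containing one DelPortCommand, mirroring the txn above
    api.del_port('tapf838d84d-a8', bridge='br-int',
                 if_exists=True).execute(check_error=True)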
Apr 21 11:05:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:45 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:05:45 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:05:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:45 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Deleted allocations for instance 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585 Apr 21 11:05:45 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-73e07410-435a-46b0-841a-f80a1ebe6717 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.359s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] Acquiring lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:05:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:05:46 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] Lock "4dc4e5a5-6f31-4466-a6bb-dae9a2a93585-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:05:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] No waiting events found dispatching network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:05:46 user nova-compute[70954]: WARNING nova.compute.manager [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received unexpected event network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928 for instance with vm_state deleted and task_state None. 
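The pair of messages just above, "No waiting events found dispatching network-vif-plugged-..." followed by the WARNING about an unexpected event for an instance with vm_state deleted, is the compute manager matching an externally reported Neutron event against waiters registered by in-flight operations; because the instance had already been torn down, nothing was waiting on the plug event. A simplified, illustrative sketch of that pop-or-warn pattern (plain Python, not Nova's actual InstanceEvents implementation):

    import threading
    from collections import defaultdict

    class InstanceEventsSketch:
        """Illustrative only: waiters keyed by (instance_uuid, event_name)."""
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)

        def expect(self, instance_uuid, event_name):
            # An in-flight operation registers interest before triggering the event.
            waiter = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = waiter
            return waiter

        def pop_instance_event(self, instance_uuid, event_name):
            # The external-event handler pops a waiter if one exists, else None.
            with self._lock:
                return self._waiters.get(instance_uuid, {}).pop(event_name, None)

    events = InstanceEventsSketch()
    waiter = events.pop_instance_event(
        '4dc4e5a5-6f31-4466-a6bb-dae9a2a93585',
        'network-vif-plugged-f838d84d-a8eb-4c45-945a-ea4621cd1928')
    if waiter is None:
        print('No waiting events found; the event is logged as unexpected.')
    else:
        waiter.set()  # wake whoever was blocked on the event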
Apr 21 11:05:46 user nova-compute[70954]: DEBUG nova.compute.manager [req-399c01a4-b8bc-41e7-b5b3-eb64f59078e9 req-22f60dd3-1e3a-49d5-b3d0-508d1a1af94f service nova] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Received event network-vif-deleted-f838d84d-a8eb-4c45-945a-ea4621cd1928 {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:05:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:05:58 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:05:58 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] VM Stopped (Lifecycle Event) Apr 21 11:05:58 user nova-compute[70954]: DEBUG nova.compute.manager [None req-efb35b52-8051-4c16-800c-f8df97208674 None None] [instance: 4dc4e5a5-6f31-4466-a6bb-dae9a2a93585] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:05:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:06:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:06:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:06:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:04 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:08 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None 
req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:06:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:06:12 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 11:06:12 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:16 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:16 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None 
req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:06:17 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:06:17 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:06:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9189MB free_disk=26.550880432128906GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": 
"15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:06:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:18 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:19 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:19 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:23 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:27 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:06:27 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
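The inventory payload reported to Placement above determines how much of this host the scheduler may hand out: for each resource class the schedulable capacity is (total - reserved) * allocation_ratio, and the reserved 512 MB is also why the final resource view shows used_ram=512MB with no instances running. A quick arithmetic check against the logged numbers (plain Python, local names only):

    inventory = {
        'VCPU': {'total': 12, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 40, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 48.0 schedulable vCPUs, MEMORY_MB 15511.0 MB, DISK_GB 40.0 GB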
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:06:28 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:06:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:06:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:06:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Starting instance... 
{{(pid=70954) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70954) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 11:06:38 user nova-compute[70954]: INFO nova.compute.claims [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Claim successful on node user Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Start building 
networks asynchronously for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Allocating IP information in the background. {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.network.neutron [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] allocate_for_instance() {{(pid=70954) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 11:06:38 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 11:06:38 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:38 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Start building block device mappings for instance. {{(pid=70954) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.policy [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3ef4e7c36ed43d9a00f7b7b9731917e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38ec5db9e7c744dcb2d4ae6737822da4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70954) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Start spawning the instance on the hypervisor. 
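The "Policy check for network:attach_external_network failed with credentials {...}" DEBUG line above is Nova's policy layer (nova/policy.py:203) denying the external-network attach rule to a plain project member; underneath, this is an oslo.policy Enforcer evaluating a registered rule against the request credentials. A minimal sketch of that check: the rule name and role list are taken from the log, while the 'is_admin:True' check string is only an assumed stand-in for Nova's real registered default:

    from oslo_config import cfg
    from oslo_policy import policy

    CONF = cfg.CONF
    enforcer = policy.Enforcer(CONF)
    # Assumed default for illustration; Nova registers its own defaults for this rule.
    enforcer.register_default(
        policy.RuleDefault('network:attach_external_network', 'is_admin:True'))

    creds = {'roles': ['reader', 'member'], 'is_admin': False,
             'project_id': '38ec5db9e7c744dcb2d4ae6737822da4'}
    allowed = enforcer.enforce('network:attach_external_network',
                               target={}, creds=creds, do_raise=False)
    print(allowed)  # False for this non-admin context, matching the log line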
{{(pid=70954) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Creating instance directory {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 11:06:39 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Creating image(s) Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "/opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "/opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "/opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.165s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils 
[None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.148s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk 1073741824 {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a,backing_fmt=raw /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk 1073741824" returned: 0 in 0.050s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "7bbc41f624e00c3d4643ded3a7dfea532adc4b4a" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.205s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7bbc41f624e00c3d4643ded3a7dfea532adc4b4a --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Checking if we can resize image /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk. size=1073741824 {{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.virt.disk.api [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Cannot resize image /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk to a smaller size. 
{{(pid=70954) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.objects.instance [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lazy-loading 'migration_context' on Instance uuid 09c483c1-f6c1-4529-afc7-b5774df793ab {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Created local disks {{(pid=70954) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Ensure instance console log exists: /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/console.log {{(pid=70954) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:39 user nova-compute[70954]: DEBUG nova.network.neutron [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Successfully created port: bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.network.neutron [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Successfully updated port: bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None 
req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquired lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.network.neutron [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Building network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.compute.manager [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-changed-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.compute.manager [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Refreshing instance network info cache due to event network-changed-bad0b018-583e-4271-8585-5c56eba9fd6b. {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] Acquiring lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.network.neutron [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Instance cache missing network info. 
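Two oslo.concurrency primitives account for nearly all of the lock lines in this trace: the lockutils.lock() context manager, which produces the "Acquiring lock" / "Acquired lock" / "Releasing lock" triples logged from lockutils.py:312/315/333 (for example the "refresh_cache-09c483c1-..." lock above), and the lockutils.synchronized() decorator, whose inner wrapper at lockutils.py:404/409/423 logs the "acquired by ... :: waited" and "released ... :: held" pairs. A minimal sketch of both forms, reusing lock names from the log:

    from oslo_concurrency import lockutils

    # Context-manager form: the Acquiring / Acquired / Releasing triple.
    with lockutils.lock('refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab'):
        pass  # e.g. rebuild the per-instance network info cache while holding the lock

    # Decorator form: the "acquired by ... waited" / "released ... held" pair.
    @lockutils.synchronized('compute_resources')
    def update_usage():
        pass  # resource-tracker style critical section

    update_usage()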
{{(pid=70954) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.network.neutron [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updating instance_info_cache with network_info: [{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Releasing lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Instance network_info: |[{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70954) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] Acquired lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.network.neutron [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Refreshing network info cache for port bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Start _get_guest_xml network_info=[{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_format': None, 'guest_format': None, 'image_id': '3b29a01a-1fc0-4d0d-89fb-23d22b2de02e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 11:06:40 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:06:40 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70954) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T10:44:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T10:43:25Z,direct_url=,disk_format='qcow2',id=3b29a01a-1fc0-4d0d-89fb-23d22b2de02e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='a3109aa78f014d0da3638064a889676d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T10:43:26Z,virtual_size=,visibility=), allow threads: True {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Flavor limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Image limits 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Flavor pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Image pref 0:0:0 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70954) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Got 1 possible topologies {{(pid=70954) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 11:06:40 user nova-compute[70954]: DEBUG nova.virt.hardware [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70954) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T11:06:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1268614226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1268614226',id=25,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1UfeAU5ejPvKiIbVrSlezxGGpVC4gfHz7J9grOoCm6xuHHRWZ2v1kz4ntP7G17kSW98hRXpmZVLOyqlGHRYSA8xjvWuXXo/irIIn270UPQGiMFwjijpnxL1y5wwHMThA==',key_name='tempest-keypair-1706068318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ec5db9e7c744dcb2d4ae6737822da4',ramdisk_id='',reservation_id='r-mo8cy3gp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1477853719',owner_user_name='tempest-AttachVolumeShelveTestJSON-1477853719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T11:06:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b3ef4e7c36ed43d9a00f7b7b9731917e',uuid=09c483c1-f6c1-4529-afc7-b5774df793ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70954) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converting VIF {"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.objects.instance [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lazy-loading 'pci_devices' on Instance uuid 09c483c1-f6c1-4529-afc7-b5774df793ab {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] End _get_guest_xml xml= Apr 21 11:06:41 user nova-compute[70954]: 09c483c1-f6c1-4529-afc7-b5774df793ab Apr 21 11:06:41 user nova-compute[70954]: instance-00000019 Apr 21 11:06:41 user nova-compute[70954]: 131072 Apr 21 11:06:41 user nova-compute[70954]: 1 Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: tempest-AttachVolumeShelveTestJSON-server-1268614226 Apr 21 11:06:41 user nova-compute[70954]: 2023-04-21 11:06:40 Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: 128 Apr 21 11:06:41 user nova-compute[70954]: 1 Apr 21 11:06:41 user nova-compute[70954]: 0 Apr 21 11:06:41 user nova-compute[70954]: 0 Apr 21 11:06:41 user nova-compute[70954]: 1 Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: tempest-AttachVolumeShelveTestJSON-1477853719-project-member Apr 21 11:06:41 user nova-compute[70954]: tempest-AttachVolumeShelveTestJSON-1477853719 Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: OpenStack Foundation Apr 21 11:06:41 user nova-compute[70954]: OpenStack Nova Apr 21 11:06:41 user 
nova-compute[70954]: 0.0.0 Apr 21 11:06:41 user nova-compute[70954]: 09c483c1-f6c1-4529-afc7-b5774df793ab Apr 21 11:06:41 user nova-compute[70954]: 09c483c1-f6c1-4529-afc7-b5774df793ab Apr 21 11:06:41 user nova-compute[70954]: Virtual Machine Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: hvm Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Nehalem Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: /dev/urandom Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: Apr 21 11:06:41 user nova-compute[70954]: {{(pid=70954) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T11:06:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1268614226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1268614226',id=25,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1UfeAU5ejPvKiIbVrSlezxGGpVC4gfHz7J9grOoCm6xuHHRWZ2v1kz4ntP7G17kSW98hRXpmZVLOyqlGHRYSA8xjvWuXXo/irIIn270UPQGiMFwjijpnxL1y5wwHMThA==',key_name='tempest-keypair-1706068318',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ec5db9e7c744dcb2d4ae6737822da4',ramdisk_id='',reservation_id='r-mo8cy3gp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1477853719',owner_user_name='tempest-AttachVolumeShelveTestJSON-1477853719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T11:06:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b3ef4e7c36ed43d9a00f7b7b9731917e',uuid=09c483c1-f6c1-4529-afc7-b5774df793ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converting VIF {"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG os_vif [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') {{(pid=70954) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbad0b018-58, may_exist=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbad0b018-58, col_values=(('external_ids', {'iface-id': 'bad0b018-583e-4271-8585-5c56eba9fd6b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:b1:80', 'vm-uuid': '09c483c1-f6c1-4529-afc7-b5774df793ab'}),)) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:06:41 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:41 user nova-compute[70954]: INFO os_vif [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] No BDM found with device name vda, not building metadata. {{(pid=70954) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] No VIF found with MAC fa:16:3e:1e:b1:80, not building metadata {{(pid=70954) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.network.neutron [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updated VIF entry in instance network info cache for port bad0b018-583e-4271-8585-5c56eba9fd6b. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG nova.network.neutron [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updating instance_info_cache with network_info: [{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:06:41 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-6635995f-bccd-41c4-9eb1-e9a33f7a3dd5 req-967d73a5-2bde-464f-aaa4-ef59ee720f87 service nova] Releasing lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG nova.compute.manager [req-eb112f1b-9a13-4645-a682-57b67b5d1e9d req-90c4ee9e-c689-4a43-a6d1-78146af44242 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-eb112f1b-9a13-4645-a682-57b67b5d1e9d req-90c4ee9e-c689-4a43-a6d1-78146af44242 service nova] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-eb112f1b-9a13-4645-a682-57b67b5d1e9d req-90c4ee9e-c689-4a43-a6d1-78146af44242 service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-eb112f1b-9a13-4645-a682-57b67b5d1e9d req-90c4ee9e-c689-4a43-a6d1-78146af44242 service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG nova.compute.manager [req-eb112f1b-9a13-4645-a682-57b67b5d1e9d req-90c4ee9e-c689-4a43-a6d1-78146af44242 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] No waiting events found dispatching network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:06:42 user nova-compute[70954]: WARNING nova.compute.manager [req-eb112f1b-9a13-4645-a682-57b67b5d1e9d req-90c4ee9e-c689-4a43-a6d1-78146af44242 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received unexpected event network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b for instance with vm_state building and task_state spawning. Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:42 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:43 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Resumed> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:06:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] VM Resumed (Lifecycle Event) Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Instance event wait completed in 0 seconds for {{(pid=70954) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Guest created on hypervisor {{(pid=70954) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 11:06:44 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Instance spawned successfully. 
Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Found default for hw_cdrom_bus of ide {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Found default for hw_disk_bus of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Found default for hw_input_bus of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Found default for hw_pointer_model of None {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Found default for hw_video_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.libvirt.driver [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 
tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Found default for hw_vif_model of virtio {{(pid=70954) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 11:06:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.virt.driver [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] Emitting event Started> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:06:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] VM Started (Lifecycle Event) Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70954) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 11:06:44 user nova-compute[70954]: INFO nova.compute.manager [None req-835edc73-f1a8-4826-8143-8b7ed83d7f23 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 11:06:44 user nova-compute[70954]: INFO nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Took 5.47 seconds to spawn the instance on the hypervisor. Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:06:44 user nova-compute[70954]: INFO nova.compute.manager [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Took 5.98 seconds to build instance. 
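The paired "Synchronizing instance power state ... DB power_state: 0, VM power_state: 1" and "pending task (spawning). Skip." records above come down to a simple guard: the lifecycle event reports the guest as running while the database still says NOSTATE, but nothing is written back while a task is in flight. A hypothetical, self-contained illustration of that decision (not nova's code) is:

    # Illustrative only -- mirrors the skip logic the records above describe.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        if task_state is not None:
            # "During sync_power_state the instance has a pending task (spawning). Skip."
            return 'skip'
        if db_power_state != vm_power_state:
            return 'update-db'   # would persist the hypervisor's view of the state
        return 'in-sync'

    assert sync_power_state(NOSTATE, RUNNING, 'spawning') == 'skip'
    assert sync_power_state(NOSTATE, RUNNING, None) == 'update-db'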
Apr 21 11:06:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-8c6603c4-c6ad-45f2-95cb-d09722a0c7e6 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.083s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [req-f5dd356d-fb5b-4364-b5d7-c7ac966ded19 req-f7b72c3b-4ec5-4454-8c25-41c7c441448e service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f5dd356d-fb5b-4364-b5d7-c7ac966ded19 req-f7b72c3b-4ec5-4454-8c25-41c7c441448e service nova] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f5dd356d-fb5b-4364-b5d7-c7ac966ded19 req-f7b72c3b-4ec5-4454-8c25-41c7c441448e service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-f5dd356d-fb5b-4364-b5d7-c7ac966ded19 req-f7b72c3b-4ec5-4454-8c25-41c7c441448e service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:06:44 user nova-compute[70954]: DEBUG nova.compute.manager [req-f5dd356d-fb5b-4364-b5d7-c7ac966ded19 req-f7b72c3b-4ec5-4454-8c25-41c7c441448e service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] No waiting events found dispatching network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:06:44 user nova-compute[70954]: WARNING nova.compute.manager [req-f5dd356d-fb5b-4364-b5d7-c7ac966ded19 req-f7b72c3b-4ec5-4454-8c25-41c7c441448e service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received unexpected event network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b for instance with vm_state active and task_state None. 
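Each network-vif-plugged event above is popped under a per-instance lock named "<uuid>-events" (the Acquiring/acquired/released lockutils records), and when no waiter is registered the event is logged as unexpected, exactly as in the two WARNING records. A rough stand-in sketch of that pattern with oslo.concurrency (not nova's InstanceEvents class; the dictionary state is purely illustrative):

    # Stand-in sketch: pop an external event under a per-instance lock, as the
    # lockutils records above show for "09c483c1-...-events".
    from oslo_concurrency import lockutils

    _waiting_events = {}  # {instance_uuid: {event_name: waiter}} -- illustrative state

    def pop_instance_event(instance_uuid, event_name):
        with lockutils.lock('%s-events' % instance_uuid):
            return _waiting_events.get(instance_uuid, {}).pop(event_name, None)

    waiter = pop_instance_event('09c483c1-f6c1-4529-afc7-b5774df793ab',
                                'network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b')
    if waiter is None:
        print('No waiting events found; treating event as unexpected')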
Apr 21 11:06:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:48 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:06:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:07:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:07:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:07:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:07:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None 
None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:07:13 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 09c483c1-f6c1-4529-afc7-b5774df793ab {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:07:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updating instance_info_cache with network_info: [{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:07:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:07:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:07:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 
11:07:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:07:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:07:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:07:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:07:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:07:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:07:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9111MB free_disk=26.528270721435547GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": 
"0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 09c483c1-f6c1-4529-afc7-b5774df793ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:07:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:07:19 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:19 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:07:21 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:21 user nova-compute[70954]: DEBUG 
oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:07:28 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:07:28 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:07:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:07:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:07:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:07:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:07:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:07:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:07:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:06 user nova-compute[70954]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquired lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Forcefully refreshing network info cache for instance {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 11:08:13 user nova-compute[70954]: DEBUG nova.objects.instance [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lazy-loading 'info_cache' on Instance uuid 09c483c1-f6c1-4529-afc7-b5774df793ab {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:08:14 user nova-compute[70954]: DEBUG nova.network.neutron [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updating instance_info_cache with network_info: [{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": 
"bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:08:14 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Releasing lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:08:14 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updated the network info_cache for instance {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 11:08:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) 
{{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:08:17 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG oslo_concurrency.processutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=70954) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 11:08:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:08:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
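The two qemu-img invocations above run under oslo_concurrency.prlimit, which caps the child process at 1 GiB of address space (--as=1073741824) and 30 seconds of CPU time (--cpu=30) before inspecting the instance's disk. Below is a minimal sketch of the same pattern using oslo.concurrency directly; it is illustrative only, not nova's own wrapper, and assumes oslo.concurrency is installed and qemu-img is on PATH.

    from oslo_concurrency import processutils

    # Caps matching the wrapper in the log: --as=1073741824 (1 GiB of address
    # space) and --cpu=30 (30 s of CPU time) for the qemu-img child process.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        address_space=1 * 1024 * 1024 * 1024,
        cpu_time=30)

    def qemu_img_info(disk_path):
        # Passing prlimit= makes processutils prepend the
        # "python -m oslo_concurrency.prlimit --as=... --cpu=... --" wrapper
        # seen in the log. --force-share lets qemu-img read an image a running
        # guest still holds open; --output=json gives a parseable result.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', disk_path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return out

    # Usage (hypothetical path; the log inspects the instance's own disk file):
    # print(qemu_img_info('/opt/stack/data/nova/instances/<instance-uuid>/disk'))
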
Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9136MB free_disk=26.5277099609375GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Instance 09c483c1-f6c1-4529-afc7-b5774df793ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70954) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:08:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:22 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:22 user nova-compute[70954]: DEBUG oslo_service.periodic_task 
[None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:22 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances with incomplete migration {{(pid=70954) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 11:08:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:08:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:08:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:08:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-changed-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG nova.compute.manager [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Refreshing instance network info cache due to event network-changed-bad0b018-583e-4271-8585-5c56eba9fd6b. 
{{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] Acquiring lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] Acquired lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG nova.network.neutron [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Refreshing network info cache for port bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:08:30 user nova-compute[70954]: DEBUG nova.network.neutron [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updated VIF entry in instance network info cache for port bad0b018-583e-4271-8585-5c56eba9fd6b. 
{{(pid=70954) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 11:08:30 user nova-compute[70954]: DEBUG nova.network.neutron [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updating instance_info_cache with network_info: [{"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:08:30 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-c3be1789-f0c0-4564-a3e9-1d01ed765318 req-f294935c-6864-483d-814a-25a408997d30 service nova] Releasing lock "refresh_cache-09c483c1-f6c1-4529-afc7-b5774df793ab" {{(pid=70954) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:31 user nova-compute[70954]: INFO nova.compute.manager [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Terminating instance Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.compute.manager [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Start destroying the instance on the hypervisor. {{(pid=70954) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-de59ff41-a353-40f1-8e99-1cab4ff8a71f req-c2ea727e-b26a-4988-829e-5726071b8f17 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-vif-unplugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-de59ff41-a353-40f1-8e99-1cab4ff8a71f req-c2ea727e-b26a-4988-829e-5726071b8f17 service nova] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-de59ff41-a353-40f1-8e99-1cab4ff8a71f req-c2ea727e-b26a-4988-829e-5726071b8f17 service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-de59ff41-a353-40f1-8e99-1cab4ff8a71f req-c2ea727e-b26a-4988-829e-5726071b8f17 service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" 
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-de59ff41-a353-40f1-8e99-1cab4ff8a71f req-c2ea727e-b26a-4988-829e-5726071b8f17 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] No waiting events found dispatching network-vif-unplugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.compute.manager [req-de59ff41-a353-40f1-8e99-1cab4ff8a71f req-c2ea727e-b26a-4988-829e-5726071b8f17 service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-vif-unplugged-bad0b018-583e-4271-8585-5c56eba9fd6b for instance with task_state deleting. {{(pid=70954) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: INFO nova.virt.libvirt.driver [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Instance destroyed successfully. Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.objects.instance [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lazy-loading 'resources' on Instance uuid 09c483c1-f6c1-4529-afc7-b5774df793ab {{(pid=70954) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.virt.libvirt.vif [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T11:06:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1268614226',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1268614226',id=25,image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1UfeAU5ejPvKiIbVrSlezxGGpVC4gfHz7J9grOoCm6xuHHRWZ2v1kz4ntP7G17kSW98hRXpmZVLOyqlGHRYSA8xjvWuXXo/irIIn270UPQGiMFwjijpnxL1y5wwHMThA==',key_name='tempest-keypair-1706068318',keypairs=,launch_index=0,launched_at=2023-04-21T11:06:44Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='38ec5db9e7c744dcb2d4ae6737822da4',ramdisk_id='',reservation_id='r-mo8cy3gp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3b29a01a-1fc0-4d0d-89fb-23d22b2de02e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1477853719',owner_user_name='tempest-AttachVolumeShelveTestJSON-1477853719-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T11:06:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b3ef4e7c36ed43d9a00f7b7b9731917e',uuid=09c483c1-f6c1-4529-afc7-b5774df793ab,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converting VIF {"id": "bad0b018-583e-4271-8585-5c56eba9fd6b", "address": "fa:16:3e:1e:b1:80", "network": {"id": "173a131c-e6e2-4f94-a591-9a96ff4967f2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-195371137-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "38ec5db9e7c744dcb2d4ae6737822da4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbad0b018-58", "ovs_interfaceid": "bad0b018-583e-4271-8585-5c56eba9fd6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.network.os_vif_util [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') {{(pid=70954) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG os_vif [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') {{(pid=70954) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbad0b018-58, bridge=br-int, if_exists=True) {{(pid=70954) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:31 user nova-compute[70954]: INFO os_vif [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:b1:80,bridge_name='br-int',has_traffic_filtering=True,id=bad0b018-583e-4271-8585-5c56eba9fd6b,network=Network(173a131c-e6e2-4f94-a591-9a96ff4967f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbad0b018-58') Apr 21 11:08:31 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None 
req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Deleting instance files /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab_del Apr 21 11:08:31 user nova-compute[70954]: INFO nova.virt.libvirt.driver [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Deletion of /opt/stack/data/nova/instances/09c483c1-f6c1-4529-afc7-b5774df793ab_del complete Apr 21 11:08:31 user nova-compute[70954]: INFO nova.compute.manager [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Took 0.70 seconds to destroy the instance on the hypervisor. Apr 21 11:08:31 user nova-compute[70954]: DEBUG oslo.service.loopingcall [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70954) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.compute.manager [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Deallocating network for instance {{(pid=70954) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 11:08:31 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] deallocate_for_instance() {{(pid=70954) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 11:08:32 user nova-compute[70954]: DEBUG nova.network.neutron [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Updating instance_info_cache with network_info: [] {{(pid=70954) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 11:08:32 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Took 0.87 seconds to deallocate network for instance. 
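The unplug sequence above ("Unplugging vif VIFOpenVSwitch(...)", the ovsdbapp DelPortCommand removing tapbad0b018-58 from br-int, then "Successfully unplugged vif") is the os-vif library tearing down the instance's OVS port. The following is a rough sketch of calling os-vif directly with field values taken from that log entry; it assumes a host with Open vSwitch and the os-vif 'ovs' plugin available, and it is not nova's vif driver code (nova first converts its own network model into these objects, per the "Converting VIF" / "Converted object" lines).

    import os_vif
    from os_vif.objects import instance_info as instance_info_obj
    from os_vif.objects import network as network_obj
    from os_vif.objects import vif as vif_obj

    # Loads the linux_bridge/noop/ovs plugins, as at nova-compute start-up.
    os_vif.initialize()

    # Field values copied from the VIFOpenVSwitch repr in the log entry above.
    vif = vif_obj.VIFOpenVSwitch(
        id='bad0b018-583e-4271-8585-5c56eba9fd6b',
        address='fa:16:3e:1e:b1:80',
        vif_name='tapbad0b018-58',
        bridge_name='br-int',
        plugin='ovs',
        port_profile=vif_obj.VIFPortProfileOpenVSwitch(
            interface_id='bad0b018-583e-4271-8585-5c56eba9fd6b'),
        network=network_obj.Network(
            id='173a131c-e6e2-4f94-a591-9a96ff4967f2', bridge='br-int'))

    instance = instance_info_obj.InstanceInfo(
        uuid='09c483c1-f6c1-4529-afc7-b5774df793ab',
        name='tempest-AttachVolumeShelveTestJSON-server-1268614226')

    # Asks the 'ovs' plugin to remove tapbad0b018-58 from br-int, which is what
    # drives the DelPortCommand transaction logged above.
    os_vif.unplug(vif, instance)
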
Apr 21 11:08:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:32 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:08:32 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:08:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:32 user nova-compute[70954]: INFO nova.scheduler.client.report [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Deleted allocations for instance 09c483c1-f6c1-4529-afc7-b5774df793ab Apr 21 11:08:32 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-1bb85399-b5a7-4932-8371-1764a78e9e85 tempest-AttachVolumeShelveTestJSON-1477853719 tempest-AttachVolumeShelveTestJSON-1477853719-project-member] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.858s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG nova.compute.manager 
[req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] Acquiring lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] Lock "09c483c1-f6c1-4529-afc7-b5774df793ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] No waiting events found dispatching network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 11:08:33 user nova-compute[70954]: WARNING nova.compute.manager [req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received unexpected event network-vif-plugged-bad0b018-583e-4271-8585-5c56eba9fd6b for instance with vm_state deleted and task_state None. 
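Nearly every lock message in this log comes from two oslo.concurrency primitives: the "Acquiring/Acquired/Releasing lock" trio (e.g. the refresh_cache-... lines) is the lock() context manager, while the 'Lock "..." acquired by "..." :: waited/held' pairs are emitted by the synchronized() decorator. Here is a small, self-contained sketch of both, reusing lock names copied from the log; the functions are toy stand-ins, not nova's InstanceEvents or cache-heal methods.

    from oslo_concurrency import lockutils

    # Both examples use in-process locks (no external=True, so no lock file).

    @lockutils.synchronized('09c483c1-f6c1-4529-afc7-b5774df793ab-events')
    def pop_event():
        # Toy critical section standing in for the events handling guarded above.
        return None

    def refresh_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance's network info cache under the lock

    pop_event()
    refresh_cache('09c483c1-f6c1-4529-afc7-b5774df793ab')
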
Apr 21 11:08:33 user nova-compute[70954]: DEBUG nova.compute.manager [req-fadb9c34-0ca2-43f4-a9a0-368b0167294e req-fadb2b06-ab81-4533-94a3-bd4f22d89e5c service nova] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Received event network-vif-deleted-bad0b018-583e-4271-8585-5c56eba9fd6b {{(pid=70954) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Cleaning up deleted instances {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 11:08:33 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] There are 0 instances to clean {{(pid=70954) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 11:08:34 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:08:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70954) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 11:08:46 user nova-compute[70954]: INFO nova.compute.manager [-] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] VM Stopped (Lifecycle Event) Apr 21 11:08:46 user nova-compute[70954]: DEBUG nova.compute.manager [None req-3883bdcd-c913-4c19-8d3a-1b8e2bac57f0 None None] [instance: 09c483c1-f6c1-4529-afc7-b5774df793ab] Checking state {{(pid=70954) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:08:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:08:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:14 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:09:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:09:15 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. 
{{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 11:09:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:09:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:09:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:09:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:09:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:09:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:09:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 
11:09:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:09:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9232MB free_disk=26.538787841796875GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": 
"0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing inventories for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating ProviderTree inventory for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 
'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Updating inventory in ProviderTree for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing aggregate associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, aggregates: None {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Refreshing trait associations for resource provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97, traits: COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT {{(pid=70954) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:09:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:09:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:22 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:22 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:24 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:26 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:29 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:09:29 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70954) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 11:09:31 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70954) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 11:09:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:09:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:36 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70954) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 11:09:41 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:46 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:09:49 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:51 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:53 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:54 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:56 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:09:58 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:01 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:02 user 
nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:03 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:05 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:06 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:11 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:10:15 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:16 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 11:10:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Starting heal instance info cache {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 11:10:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Rebuilding the list of instances to heal {{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 11:10:16 user nova-compute[70954]: DEBUG nova.compute.manager [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Didn't find any instances for network info cache update. 
{{(pid=70954) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 11:10:16 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:17 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager.update_available_resource {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:10:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:10:17 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:10:17 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Auditing locally available compute resources for user (node: user) {{(pid=70954) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 11:10:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 11:10:18 user nova-compute[70954]: WARNING nova.virt.libvirt.driver [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 11:10:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Hypervisor/Node resource view: name=user free_ram=9344MB free_disk=26.522567749023438GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70954) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70954) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG nova.compute.provider_tree [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed in ProviderTree for provider: f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 {{(pid=70954) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG nova.scheduler.client.report [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Inventory has not changed for provider f5a93adf-7a38-4ac6-ba5b-d6a75e692e97 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70954) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG nova.compute.resource_tracker [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Compute_service record updated for user:user {{(pid=70954) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 11:10:18 user nova-compute[70954]: DEBUG 
oslo_concurrency.lockutils [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s {{(pid=70954) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 11:10:19 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:20 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:21 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 11:10:22 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:24 user nova-compute[70954]: DEBUG oslo_service.periodic_task [None req-d6d1a7f7-77a0-4e04-9a82-1e67de64fa18 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70954) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 11:10:26 user nova-compute[70954]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70954) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}